Consider a sequence S of length n emitted by a Discrete Memoryless Source (DMS) with unknown distribution p. We wish to construct a lossless source code that maps S to a sequence S′ of minimal length m such that S′ approximates, in terms of Kullback-Leibler divergence, a sequence emitted by another DMS with known distribution q. Our main result is the existence of a coding scheme that performs this task with an asymptotically optimal compression rate, i.e., such that m/n converges to H(p)/H(q) as n goes to infinity. Our coding scheme overcomes the challenge posed by the unknown distribution p by relying on a sufficiently fine estimate of H(p), followed by an appropriately designed type-based compression that jointly performs source resolvability and universal lossless source coding. Our result recovers several previous results that assume p uniform, q uniform, or p known. The price paid for these generalizations is the use of common randomness with vanishing rate. Through an analysis of second-order asymptotics and error exponents, we further show that the length of this common randomness scales roughly as the square root of n.
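The limiting rate H(p)/H(q) can be illustrated with a minimal sketch. The snippet below is not the paper's coding scheme; it only computes a plug-in estimate of the rate, where H(p) is estimated from the empirical type of the observed sequence, as suggested by the entropy-estimation step above. All function names are illustrative.

```python
import math
from collections import Counter

def entropy(dist):
    """Shannon entropy (in bits) of a probability distribution."""
    return -sum(p * math.log2(p) for p in dist if p > 0)

def empirical_type(seq):
    """Empirical distribution (type) of a finite sequence."""
    counts = Counter(seq)
    n = len(seq)
    return [c / n for c in counts.values()]

def asymptotic_rate(seq, q):
    """Plug-in estimate of the limiting compression rate m/n = H(p)/H(q),
    with H(p) replaced by the entropy of the empirical type of seq."""
    return entropy(empirical_type(seq)) / entropy(q)

# Hypothetical example: a biased binary source, target distribution q uniform.
seq = [0] * 900 + [1] * 100      # empirical type (0.9, 0.1)
q = [0.5, 0.5]                   # known target distribution, H(q) = 1 bit
print(round(asymptotic_rate(seq, q), 3))  # → 0.469
```

Since the rate is a ratio of entropies, the logarithm base cancels; base 2 is used here only for concreteness.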