Large-scale text generation models perform strongly across a wide range of linguistic tasks, yet they often struggle to produce highly creative or diverse outputs. Stochastic Semantic Drift (SSD) introduces a mechanism that injects controlled randomness into a model's semantic embeddings, enhancing its ability to generate more varied and innovative text. By perturbing token embeddings with stochastic drift, the method lets the model explore broader conceptual spaces while maintaining fluency and semantic coherence. Extensive experiments show that SSD substantially improves diversity metrics, with the SSD-enhanced model outperforming its baseline in lexical variety, conceptual novelty, and tonal diversity. The results also show that SSD preserves accuracy and fluency, striking an effective balance between creativity and linguistic correctness. The proposed approach opens new possibilities for generative tasks, enabling more flexible and creative applications across diverse domains.
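The drift mechanism described above can be sketched in a few lines. This is a minimal illustration, not the paper's actual implementation: it assumes SSD amounts to adding zero-mean Gaussian noise to each token embedding and then rescaling to preserve the original vector magnitude (so coherence-related norm information is retained). The function name `drift_embeddings` and the noise scale `sigma` are illustrative choices, not terms from the source.

```python
import math
import random

def drift_embeddings(embeddings, sigma=0.1, seed=0):
    """Illustrative stochastic semantic drift (assumption, not the paper's
    exact method): perturb each embedding with zero-mean Gaussian noise,
    then renormalize so each vector keeps its original magnitude."""
    rng = random.Random(seed)  # fixed seed for reproducible "drift"
    drifted = []
    for vec in embeddings:
        # Original magnitude, guarded against the all-zero vector.
        norm = math.sqrt(sum(x * x for x in vec)) or 1.0
        # Inject controlled randomness into every dimension.
        noisy = [x + rng.gauss(0.0, sigma) for x in vec]
        new_norm = math.sqrt(sum(x * x for x in noisy)) or 1.0
        # Rescale back to the original norm to limit semantic damage.
        drifted.append([x * norm / new_norm for x in noisy])
    return drifted

# Example: two toy token embeddings; drift changes direction, not length.
tokens = [[1.0, 0.0, 0.0], [0.0, 2.0, 0.0]]
drifted = drift_embeddings(tokens, sigma=0.1, seed=42)
```

In practice the noise scale `sigma` would act as the creativity knob: larger values push the model toward more novel regions of embedding space at increasing risk to fluency, mirroring the diversity/correctness trade-off the abstract reports.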