Gerhard Stenzel, Michael Kölle, Tobias Rohe, Maximilian Mansky, Jonas Nüßlein, and Thomas Gabor
Quantum machine learning offers novel paradigms to address limitations in traditional natural language processing models, such as fixed context lengths and computational inefficiencies. In this work, we propose QMamba, the first quantum adaptation of the Mamba architecture, integrating selective state space models with quantum computation for efficient and scalable text generation. QMamba leverages quantum principles such as superposition and entanglement to enable unbounded context sizes and reduced computational complexity. Our contributions include the development of a quantum generative model optimized for hardware constraints, advancements in encoding, embedding, and measurement techniques, and the demonstration of its performance on pattern reproduction and context-challenging tasks such as “Needle in a Haystack.” Experimental results confirm QMamba’s potential to maintain high efficiency and performance across varying sequence lengths, laying the groundwork for future exploration in quantum-enhanced natural language processing.