Quantum computers pose a significant threat to the security of blockchain technology, which relies heavily on public-key cryptography and hash functions. The cryptographic algorithms used in blockchains, whose security rests on the hardness of integer factorization and the discrete logarithm problem, can be efficiently broken by quantum algorithms such as Shor's algorithm once sufficiently large-scale quantum computers become available. This survey paper comprehensively examines the impact of quantum computers on blockchain security and explores potential mitigation strategies. We begin by surveying the existing literature on blockchains and quantum computing, providing insight into the current state of research. We then present an overview of blockchain, highlighting its key components and functionalities. We cover the preliminaries and key definitions of quantum computing, establishing a foundation for understanding the implications for blockchain security. The application of blockchains in cybersecurity is explored, considering their strengths and vulnerabilities in light of evolving quantum computing capabilities. The survey focuses on the quantum security of blockchain's fundamental building blocks, including digital signatures, hash functions, consensus algorithms, and smart contracts. We analyze the vulnerabilities introduced by quantum computers and discuss potential countermeasures and enhancements to ensure the integrity and confidentiality of blockchain systems. Furthermore, we investigate the quantum attack surface of blockchains, identifying avenues by which quantum computing could strengthen existing attacks. We emphasize the need to develop quantum-resistant defenses and explore solutions for mitigating the threat of quantum computers to blockchains, including the adoption of quantum and post-quantum blockchain architectures.
By examining vulnerabilities and discussing mitigation strategies, we aim to guide researchers, practitioners, and policymakers in developing robust and secure blockchain systems capable of withstanding advancements in quantum computing technology.