Shor's algorithm is a quantum algorithm that was developed by Peter Shor in 1994. It is designed to efficiently factor large integers and solve the discrete logarithm problem, which have significant implications for cryptography.
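
To make the structure concrete, here is a small classical Python sketch of the reduction Shor's algorithm relies on: factoring N is reduced to finding the multiplicative order of a random base modulo N. The `find_order` and `shor_factor` names are my own, and the brute-force order search is a hypothetical stand-in for the quantum period-finding step, which is the only part a quantum computer actually speeds up.

```python
import math
import random

def find_order(a, N):
    """Brute-force the multiplicative order r of a mod N.
    (In the real algorithm, this step is done by quantum period finding.)"""
    r, x = 1, a % N
    while x != 1:
        x = (x * a) % N
        r += 1
    return r

def shor_factor(N):
    """Classical sketch of Shor's reduction from factoring to order finding."""
    if N % 2 == 0:
        return 2
    while True:
        a = random.randrange(2, N)
        g = math.gcd(a, N)
        if g > 1:               # lucky guess: a already shares a factor with N
            return g
        r = find_order(a, N)    # quantum subroutine in the real algorithm
        if r % 2:               # need an even order
            continue
        y = pow(a, r // 2, N)
        if y == N - 1:          # trivial square root of 1; try another base
            continue
        return math.gcd(y - 1, N)

print(shor_factor(15))  # prints 3 or 5
print(shor_factor(21))  # prints 3 or 7
```

This only works for toy numbers, of course; the point is to show which step the qubits are spent on.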

The computational resources required by Shor's algorithm are usually quoted in qubits rather than bits. In particular, Shor's algorithm needs a quantum computer with enough qubits to handle the problem size of interest.

The number of qubits needed for Shor's algorithm depends on the size of the number being factored or the discrete logarithm being solved. The qubit count grows roughly linearly with the bit-length of the input, i.e., logarithmically in the value of the number itself.

For factoring an N-bit number, Shor's algorithm requires approximately 2N logical qubits (some circuit constructions use about 2N+3, and the exact count depends on the implementation). This means that factoring a 2048-bit number would need on the order of 4096 logical qubits, not counting the much larger number of physical qubits that error correction would add.
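
As an illustration only, here is a short Python snippet that applies that rough 2N rule of thumb to common key sizes; the `estimated_qubits` name and the `overhead` parameter are placeholders for the extra ancilla qubits a concrete circuit construction would add.

```python
def estimated_qubits(n_bits, overhead=0):
    """Rough logical-qubit estimate for factoring an n-bit number,
    using the ~2N rule of thumb quoted above. `overhead` stands in
    for implementation-specific ancilla qubits."""
    return 2 * n_bits + overhead

for n in (512, 1024, 2048, 4096):
    print(f"{n}-bit modulus: ~{estimated_qubits(n)} logical qubits")
```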

It's worth noting that the specific number of qubits required varies with the implementation and the optimizations used. Additionally, the field of quantum computing is evolving rapidly, and the capabilities of current hardware may differ from the figures quoted here at the time of writing.
