Ordinary bits are either on or off. With massive arrays of bits, computing is possible.
Qubits, used in quantum computing, can take on any of an infinite number of states. Armed with only one qubit, computing is (in principle) possible.
What’s a “bit” anyway?
No, seriously, how are input and output to the REAL WORLD accomplished? Furthermore: given what you described, that isn't "quantum" (quantized, i.e. dealing with specified or measured amounts) computing. It's back to ANALOG computing, whose values can represent an infinite number of states ...
Def. quantize: "restrict the number of possible values of (a quantity) or states of (a system) so that certain variables can assume only certain discrete magnitudes."
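For what it's worth, both points are partly right, and a toy simulation shows how: a qubit's state is a pair of amplitudes that vary continuously (the "analog-looking" part), but reading it out always yields a discrete 0 or 1 (the "quantized" part). A minimal sketch using only the Python standard library; the function names here are illustrative, not from any real quantum library:

```python
import math
import random

def measure(a, b):
    """Collapse the state (a, b): return 0 with probability |a|^2, else 1.

    The amplitudes a and b are continuous, but the readout is always
    one of exactly two discrete values.
    """
    return 0 if random.random() < abs(a) ** 2 else 1

# An equal superposition: infinitely many (a, b) pairs satisfy
# |a|^2 + |b|^2 = 1, so the *state space* is continuous...
a = b = 1 / math.sqrt(2)

# ...yet every readout is a plain classical bit.
samples = [measure(a, b) for _ in range(10_000)]
print(set(samples))                  # only 0s and 1s ever appear
print(sum(samples) / len(samples))   # near 0.5 for this particular state
```

So the continuum lives in the amplitudes, which you can never read directly; measurement is where the "quantum" (discrete) behavior shows up.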