What is the best academic background for learning about QC?  

Just as electrical engineering is not a prerequisite for using digital computers, quantum physics is hardly essential for using quantum computers. So which fields provide the best background for being able to operate a quantum computer?

3 Answer(s)
The best fields to study to prepare one to program the type of QC that D-Wave makes would be …
– mathematics (especially discrete mathematics)
– discrete combinatorial optimization
– heuristic algorithms
– artificial intelligence
– machine learning
– spin physics
– quantum annealing
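Programming the type of QC that D-Wave makes ultimately means expressing a problem as a quadratic unconstrained binary optimization (QUBO), which the annealer then samples for low-energy bit strings. As an illustrative sketch only (the coefficients below are made up for the example, and a real annealer samples the landscape rather than enumerating it), here is a tiny brute-force QUBO evaluator in plain Python:

```python
from itertools import product

def qubo_energy(x, Q):
    """Energy of bit vector x under QUBO matrix Q: E = sum_ij Q[i][j] * x_i * x_j."""
    n = len(x)
    return sum(Q[i][j] * x[i] * x[j] for i in range(n) for j in range(n))

def brute_force_minimum(Q):
    """Exhaustively check all 2^n bit strings; an annealer samples this landscape instead."""
    n = len(Q)
    best = min(product((0, 1), repeat=n), key=lambda x: qubo_energy(x, Q))
    return best, qubo_energy(best, Q)

# Illustrative 3-variable QUBO (upper-triangular, coefficients chosen arbitrarily):
Q = [[-1, 2, 0],
     [0, -1, 2],
     [0, 0, -1]]

solution, energy = brute_force_minimum(Q)
print(solution, energy)  # (1, 0, 1) -2
```

The point of the sketch is the shape of the problem, not the solver: the whole "program" is the matrix Q, which is why discrete mathematics and combinatorial optimization top the list above.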

As a starting point, here’s a recommendation for “Software Developer Education” from collegegrad.com (randomly chosen from Google search):

Software developers usually have a bachelor’s degree, typically in computer science, software engineering, or a related field. A degree in mathematics is also acceptable. Computer science degree programs are the most common, because they tend to cover a broad range of topics…   Software developers also need skills related to the industry in which they work. Developers working in a bank, for example, should have knowledge of finance so that they can understand a bank’s computing needs.

So perhaps this is also a question for the high school student looking at a computer science degree:  how many courses in quantum computing does the faculty offer?  Are they taught by researchers active in the field?

Answered on May 4, 2016.

While I firmly believe QC stands on solid scientific, experimental, and engineering grounds, the major hurdle I see is the TRANSLATION of the problem domain into a quantum-fabric-processable DUAL with a programmable initial state, after which one can "just" wait adequately long (or short) for a QC answer to the asked configuration.
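One concrete instance of such a translation (a hedged sketch, not D-Wave's actual toolchain) is mapping a max-cut problem onto QUBO coefficients: each graph vertex becomes a binary variable marking its side of the cut, and each edge (i, j) contributes -1 to the diagonal terms Q[i][i] and Q[j][j] and +2 to Q[i][j], so that minimizing the QUBO energy maximizes the cut:

```python
def maxcut_to_qubo(n, edges):
    """Build a QUBO matrix Q so that minimizing sum_ij Q[i][j] * x_i * x_j
    yields minus the maximum cut: each edge adds -x_i - x_j + 2 * x_i * x_j."""
    Q = [[0] * n for _ in range(n)]
    for i, j in edges:
        Q[i][i] -= 1   # -x_i
        Q[j][j] -= 1   # -x_j
        Q[i][j] += 2   # +2 x_i x_j
    return Q

# Triangle graph: the best cut separates one vertex, cutting 2 of the 3 edges,
# so the minimum QUBO energy is -2.
Q = maxcut_to_qubo(3, [(0, 1), (1, 2), (0, 2)])
```

The "programmable initial state" and the wait for an answer are then the annealer's job; the translation above is the part no machine does for you.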

This is THE MAGIC, for which there are not many educational sources available at all. Colin Williams has touched the "surface" of this vast sub-space, and a lot will have to happen here before education can keep pace with the possibilities QC has brought us.


If the quoted claim that "electrical engineering is a pre-requisite to use digital computers" were taken literally, my grandmother would have been an electrical engineer.


A fair note would be that if the verb "use" were replaced with something like "design", I would agree without any further doubts.

For QC, there is a green-field problem: how do you educate someone in fields that are brand new and evolving so fast that even the Masters are new to them?

"Learning about QC" is rather a life-long mind-set, while "being able to operate a quantum computer" seems to be the non-core issue of keeping the given QC technology within its range of acceptable operating conditions.

The biggest issue is the transition of thinking from classical software engineering to quantum computing.

If one insight may help here, let me recall a lesson from a similar shift of paradigm that has already happened (and which, seen from an ex-post point of view, produced rather poor results).

Practitioners in PARALLEL-hardware computing found that classical software engineering programmes produced far worse-prepared technicians than electronics (analog circuits) programmes did. They attributed this to the natural imagination and practical intuition that "things happen immediately in time and system-wide", on which analog circuit design and engineering is built, and which is far closer to PARALLEL code execution than anything a re-educated software engineer brings. Such an engineer has so far met only SERIAL, sequentially executed code, and has built up a design knowledge-base on exactly the contrary experience (or, at most, an opportunistically concurrent one, if persevering and innovative enough to carry all the pains concurrency brings without a-priori pi-calculus planning or some other ad-hoc deterministic scaffolding).

This industry-originated experience raised a warning (a pity, only, that more than 40 (!) yes, forty years have been missed since), and there has not been much visible progress in the true-parallel domain so far. The key insights were laid down as early as the 70s and 80s in the pi-calculus, occam-pi, CSP, the lambda calculus, and similar principled paradigms for fully deterministic, highly compact, true-parallel-execution systems design and programming (no, a jungle of C-based #pragma decorators injected into still-sequential syntactic constructs does not count as a third-millennium-adequate sign of progress in "just" true-PARALLEL software design in "still" classical computing).
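To make the contrast concrete, here is a minimal CSP-flavoured sketch in Python (using bounded queue.Queue channels as a stand-in for occam-pi channels, purely for illustration): two "processes" communicate only by message passing, never by shared mutable state, which is the discipline the paragraph above argues sequential training never instils.

```python
import threading
import queue

def producer(out_ch):
    """Send a fixed stream of values over the channel, then a sentinel."""
    for value in range(5):
        out_ch.put(value)
    out_ch.put(None)  # end-of-stream sentinel

def squarer(in_ch, out_ch):
    """A 'process' that reads from one channel and writes to another."""
    while (value := in_ch.get()) is not None:
        out_ch.put(value * value)
    out_ch.put(None)

def run_pipeline():
    # maxsize=1 makes each put/get a near-rendezvous, CSP-style.
    a, b = queue.Queue(maxsize=1), queue.Queue(maxsize=1)
    threading.Thread(target=producer, args=(a,)).start()
    threading.Thread(target=squarer, args=(a, b)).start()
    results = []
    while (value := b.get()) is not None:
        results.append(value)
    return results

print(run_pipeline())  # [0, 1, 4, 9, 16]
```

Nothing here is "immediate in time and system-wide", of course; the sketch only shows the channel discipline, not true hardware parallelism.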

( Even worse for QC, which requires much bigger and more courageous shifts of mind, breath-taking jumps into thinking about non-Abelian topological degrees of freedom, and many new horizons to overcome. But we may learn a lot from this PARALLEL parallel, so as not to waste the next four decades ... again )

Answered on May 11, 2016.
