Michal's Profile




  • Asked on March 14, 2018 in Quantum Computing.

    Oh yes, they are analogues of the ANALOG computers you mention.

    Yes, I also remember the days when computing tasks were made more mechanically “bound” to the mass itself, using fluidic computers and similar equipment.

    Yet FORTRAN code still lives inside number-crunching libraries to this day, often just a bit “hidden” under the hood of some more recent or popular interactive data-provisioning layers ( Python / scikit-learn / SciPy optimiser tools, with scipy.optimize.fmin_l_bfgs_b() still executing FORTRAN in the end, being one such example ).
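    Just to make the point tangible, a minimal sketch ( the Rosenbrock test function is my illustrative choice, not anything from the thread ): the SciPy call below is a thin Python wrapper, while the iterations themselves still run inside the original L-BFGS-B FORTRAN routines:

    ```python
    # Minimal sketch: SciPy wraps the original L-BFGS-B FORTRAN code,
    # so this Python call ends up executing FORTRAN under the hood.
    from scipy.optimize import fmin_l_bfgs_b

    def rosenbrock(x):
        # classic 2-D test function with its minimum at (1, 1)
        return (1 - x[0]) ** 2 + 100 * (x[1] - x[0] ** 2) ** 2

    # approx_grad=True lets the routine estimate gradients numerically
    x_opt, f_opt, info = fmin_l_bfgs_b(rosenbrock, x0=[0.0, 0.0], approx_grad=True)
    print(x_opt)   # converges close to [1.0, 1.0]
    ```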

    Guess you might love to hear Bo Ewald’s great talk from the Quantum Computing seminar held at SC17 ( SuperComputing 2017 ), where he explains the devices in a popular manner. He speaks about different Q-device architectures, about the steps of the original pre-setting of the Q-device energy landscape that starts the D-Wave quantum adiabatic-annealing process, and also mentions some newer inputs from the Los Alamos National Laboratory, whose Quantum Computing research group also spoke there: reverse quantum annealing, problem reformulation and a few live examples. So indeed worth a look, if interested.

    Video: D-Wave Systems Seminar on Quantum Computing from SC17

    • 3 answers
    • 1 vote
  • Asked on December 1, 2016 in Foundations.

    The page has recently passed 3000 views, so I decided to respond to Andrew’s appeal and his question, raised at the beginning of August:

     Is there a form of computation that does not require counting?

    • fluidic computers do not use counting ( analog computing systems originally developed for missile controls )
    • randomness sources for cryptography ( where any form of “counting” would intrinsically mean a “predictable” non-randomness )
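    To illustrate the second point, a minimal sketch ( the LCG constants are the textbook Numerical Recipes ones, an illustrative choice ): a generator that merely “counts” through its states is fully predictable from its seed, which is exactly what a cryptographic randomness source must not be:

    ```python
    import secrets

    def lcg(seed, n, a=1664525, c=1013904223, m=2**32):
        """A linear congruential generator: pure 'counting' - the whole
        stream is determined by (and predictable from) one integer state."""
        out, x = [], seed
        for _ in range(n):
            x = (a * x + c) % m
            out.append(x)
        return out

    # Anyone who learns the seed reproduces the "random" stream exactly:
    assert lcg(42, 5) == lcg(42, 5)

    # A cryptographic randomness source has no such recoverable counter:
    token = secrets.token_bytes(16)   # 16 bytes from the OS entropy pool
    ```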

    Hope you will forgive, Andrew, this narrow reading of “form of computation”, where you tried to evoke new ideas that could release our minds from “just counting”.

    Looking forward to further inspirations and kind surprises your views and new questions are sure to open.

    • 7 answers
    • 0 votes
  • Asked on July 11, 2016 in Foundations.
    nota bene:  after some magic, it comes to me as the very same surprise as it might come to you, but somehow it happened that today I appear as if I were the author of this initial question ( which, beyond any doubt, I am not; Andrew MILNE is ), so some contexts might look strange until the admins fix this. ( FIXED 🙂 the ADMIN )

    With all due respect, Andrew, the so-called complexity classes ( ref. below + the whole Complexity-ZOO taxonomy ) are not related to anything but a sequential mode of processing. The Turing Machine model was intentionally used for such classification, as it provides common ground for both theoretical proofs and practical model evaluation in both the TIME and SPACE dimensions of the problem-under-review’s scaling ( a.k.a. computing complexity ).

    While there were some early efforts, somewhere in the late 1970s / 1980s, to extend this SEQ-mode taxonomy in the direction of PAR-mode companions ( with proposals like ||PTIME, denoting polynomial-time PTIME-complexity even on a PARALLEL computing architecture undertaking the non-SEQ processing ), the QUANTUM-computing architecture stands on much different grounds, and both its SPACE and TIME dimensions seem to me, from the research published so far, to be rather constant-bound a priori ( fixed by the manufacturing constraints of the moment ).
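    To make the TIME-dimension scaling a bit more hands-on, a minimal classical sketch ( bubble sort is my illustrative choice ): count the elementary steps at two input sizes and read the growth exponent off a log-log slope:

    ```python
    import math

    def bubble_sort_steps(n):
        """Count comparisons of a naive bubble sort on n items -
        a stand-in for the TIME dimension of a sequential model."""
        data, steps = list(range(n, 0, -1)), 0
        for i in range(len(data)):
            for j in range(len(data) - 1 - i):
                steps += 1
                if data[j] > data[j + 1]:
                    data[j], data[j + 1] = data[j + 1], data[j]
        return steps

    # Growth exponent estimated from a log-log slope between two sizes:
    s1, s2 = bubble_sort_steps(100), bubble_sort_steps(200)
    exponent = math.log(s2 / s1) / math.log(2)
    print(round(exponent, 2))   # close to 2: quadratic TIME-scaling
    ```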

    One might be delighted by the research cited below, on the general possibility of overcoming C2-class computer systems’ boundaries, where the last paragraph is my beloved one.

    Anyway, looking forward to reading more news on this innovative research and its applied-science impacts.

    • 7 answers
    • 0 votes
  • Asked on June 29, 2016 in Foundations.

    (cit.:)  “Section 4.3 ( translator’s notice: is there any chance for a giant leap ) BEYOND THE CLASS-2 COMPUTER BOUNDARIES [2]

    The main practical, though negative, implication of the previous thoughts is the fact that within Class-2 computing, no efficient solution for sequentially intractable problems is to be expected.

    Nevertheless, a question arises here: whether some other sort of parallel computers could be imagined that would be computationally more efficient than Class-2 computers.

    Indications coming from the many known, conceptually different C2-class computer models suggest that, without adding some other, fundamental computing capability, parallelism per se does not suffice to overcome the C2-class boundaries, irrespective of how we try to modify, within all thinkable possibilities, the architectures of such computers.

    As a matter of fact, it turns out that the C2-class boundaries would be crossed if non-determinism were added to an MIMD-type parallelism ( ref. Section 3.5 ).

    A non-deterministic PRAM (*) can, as an example, solve ( intractable ) problems from the NPTIME class in polylogarithmic time, and problems of demonstrably exponential sequential complexity in polynomial time.

    Because, in the context of computers, non-determinism is about as technically feasible to implement as clairvoyance, the C2 computer class seems to represent, from the efficiency point of view, the ultimate class of parallel computers, whose borders will never be crossed.


    *) PRAM: a Parallel RAM, not a SIMD-only processor, as demonstrated by Savitch and Stimson in 1979 [1]

    [1] SAVITCH, W. J. – STIMSON, M. J.: Time bounded random access machines with parallel processing. J. ACM 26, 1979, pp. 103–118.
    [2] WIEDERMANN, J.: Efficiency boundaries of parallel computing systems ( Medze efektivnosti paralelných výpočtových systémov ). Advances in Mathematics, Physics and Astronomy ( Pokroky matematiky, fyziky a astronomie ), Vol. 33 (1988), No. 2, pp. 81–94.

    • 7 answers
    • 0 votes
  • Asked on May 11, 2016 in Foundations.

    Ad 4) In case you are seriously devoted to QF, rather skip this option.

    Ad 3) Location is not the key per se; one may review published research papers on innovative approaches & advanced topics to see where QF research yields most of the accepted publications.

    Ad 2) One may consider this an advantage rather than a “weakness”. Most top-rated QF jobs require non-financial specialisations, typically Physics, CS or other sciences with strong practice in quantitative subjects and rigorous methods of cross-validation.

    Ad 1) This would best be answered by your future PhD tutor, so do not hesitate to be pro-active and get in touch with research colleagues from the Faculty of your choice ( ref.: 3 ), and be open about your research interests and PhD plans, to discuss their respective possibilities, incl. EU / ERASMUS / dual-Faculty degree options.


    As far as QC is concerned, the question would be a lot harder to address, as discussed in “What is the best academic background for learning about QC?”


    Plus, David,

    with your fields of study, you might also benefit a lot from QC in non-financial segments, with high impact expected from the new possibilities QC brings: I mean the advanced research in new special-materials design, molecular engineering, medical / drug research based on advanced protein dynamics et al. So do not be surprised once you get head-hunted for this very particular mix of advanced knowledge.

    Keep Walking, David!

    • 5 answers
    • 0 votes
  • Asked on May 11, 2016 in Foundations.

    RE: Complexity Classes: Are we on the cusp of change?

    If one strives for “Big-O explained in plain English”, the following direction may help, showing both a high-level abstract and a mathematical addendum.
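    As one plain-English direction, a minimal sketch ( linear vs. binary search is my illustrative choice ): Big-O is about how the step count grows when the input doubles, not about the absolute speed:

    ```python
    def linear_search_steps(n, target):
        # O(n): in the worst case every element is inspected
        steps = 0
        for i in range(n):
            steps += 1
            if i == target:
                break
        return steps

    def binary_search_steps(n, target):
        # O(log n): each probe halves the remaining range
        lo, hi, steps = 0, n - 1, 0
        while lo <= hi:
            steps += 1
            mid = (lo + hi) // 2
            if mid == target:
                break
            elif mid < target:
                lo = mid + 1
            else:
                hi = mid - 1
        return steps

    # Worst case (target is the last element): doubling n doubles the
    # linear cost, but adds only about one probe to the binary cost.
    print(linear_search_steps(1024, 1023), binary_search_steps(1024, 1023))
    ```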

    Diversity is something very different ( and very needed in biologically inspired systems, like GP and other forms of self-evolving evolutionary programming ).

    Problem computability / decidability is yet another classification point of view onto the landscape of advanced challenges that the frontiers of our civilisation’s knowledge face, and that QC may help with.

    ( well, hope we have much more interesting community lives than bacteria do ( at least I have never had a phone call with any such, well, so far … ), so as the classic says … only time will tell … )

    • 7 answers
    • 0 votes
  • Asked on May 11, 2016 in Quantum Computing.

    While I firmly believe QC stands on solid scientific, experimental and engineering-proven grounds, the major hurdle I feel is in the TRANSLATION of the problem domain into a quantum-fabric processable DUAL, with a programmable initial state, after which one can “just” wait adequately long ( or short ) for a QC answer to the asked configuration.

    This is THE MAGIC, for which there are not many education sources available at all. Colin Williams has touched the “surface” of this vast sub-space, and a lot will have to happen here so as to get ready to keep pace with the possibilities QC has brought to us.
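    To make this TRANSLATION step less abstract, a minimal, purely classical sketch ( the triangle MAX-CUT toy problem and all names are illustrative assumptions, not any real annealer API ): the problem is recast as an energy landscape over binary variables, whose ground state is then found by brute force, which is conceptually the same landscape an annealer would be pre-set with and left to relax:

    ```python
    from itertools import product

    # Toy "translation": MAX-CUT on a triangle graph, rewritten as an
    # energy function over binary variables - the kind of landscape a
    # quantum annealer is pre-set with, then left to settle.
    edges = [(0, 1), (1, 2), (0, 2)]

    def energy(x):
        # each cut edge (endpoints on opposite sides) lowers the energy
        return -sum(1 for i, j in edges if x[i] != x[j])

    # Classical brute force over the 2^3 configurations = the ground state
    ground = min(product([0, 1], repeat=3), key=energy)
    print(ground, energy(ground))   # any 2-vs-1 split cuts 2 of the 3 edges
    ```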


    If just cit.:  “electrical engineering is a pre-requisite to use digital computers” were true, my grandmother would have been an electrical engineer.


    A fair note would be that, were the verb “use” replaced with something like “design”, I would agree with no further doubts.

    For QC, there is a green-field problem: how to educate someone on grounds that are quite new and fairly fast self-evolving, so that even the Masters are new to these fields?

    “Learning about QC” is rather a life-long mind-set, while “being able to operate a quantum computer” seems to be a non-core issue of keeping the given QC technology within its range of acceptable operating conditions.

    The biggest issue is the transition of thinking from Classical Software Engineering into Quantum Computing.

    If one insight may help with this, let me remind you of one piece of wisdom from a similar shift of paradigm which has already happened ( and which has brought quite poor results, once observed from an ex-post point of view ).

    Practitioners from PARALLEL-hardware computing have experienced that Classical Software Engineering education programmes provided much worse-prepared technicians than those of Electronics ( analog circuits ). They attributed this to the natural imagination and the practical belief that “things happen immediately in time && system-wide”, on which circuit design & engineering in its Electronics expertise is built, and which is by far much closer to PARALLEL code execution than anything a re-educated Software Engineer can offer ( having so far met just SERIAL, sequentially executed code-bases, and having gained a design knowledge-base from practising the very contrary experience; or, at most, an opportunistically concurrent one, if perseverant and innovative enough to carry all the pains concurrency brings without a-priori pi-systems’ planning or other ad-hoc deterministic scaffolding ).

    This industry-originated experience has raised a warning ( only, pity for that, it is more than 40 (!), yes, forty, years missed on this ), and there is not much visible progress so far in the true-parallel domain ( with the latest principal insights laid down as early as the 1970s / 1980s in pi-systems, occam-pi, CSP, lambda-calculus and similar paradigms for fully deterministic, highly compact, true-parallel-execution systems’ design and programming; no, a jungle of C-based #pragma decorators injected into still-sequential syntactical constructors is not a valid, third-millennium-adequate sign of progress in true-PARALLEL software design within “still” classical computing ).
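    For a small taste of the CSP-style discipline named above, a minimal Python sketch ( queue.Queue standing in for a channel; the squares pipeline is just an illustrative choice ): the two processes communicate only through their channel, so the result stays deterministic despite the parallel execution:

    ```python
    import threading, queue

    def producer(ch):
        # a CSP-style process: communicates only via its channel
        for i in range(5):
            ch.put(i * i)
        ch.put(None)               # end-of-stream sentinel

    def consumer(ch, results):
        while True:
            item = ch.get()        # rendezvous on the channel
            if item is None:
                break
            results.append(item)

    channel, results = queue.Queue(maxsize=1), []
    t1 = threading.Thread(target=producer, args=(channel,))
    t2 = threading.Thread(target=consumer, args=(channel, results))
    t1.start(); t2.start(); t1.join(); t2.join()
    print(results)   # [0, 1, 4, 9, 16] - deterministic despite threading
    ```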

    ( Even worse for QC, which requires much bigger and more courageous shifts of mind, breath-taking jumps into thinking about non-Abelian topological degrees of freedom and many new horizons to overcome; but we may learn a lot from this PARALLEL parallel, so as not to waste the next four decades … again )

    • 3 answers
    • 1 vote
  • Asked on May 11, 2016 in Foundations.

    With all due respect to the underlying Quantum Mechanics model of Nature, |LATE> and |NOT LATE> are not mutually exclusive in the real, classical ( non-quantum ) world, if for no other reason then due to a parallel multi-level fractality, driven within an externally given seasonality of the market dynamics.

    • 1 answer
    • 0 votes
  • Asked on May 11, 2016 in Foundations.

    With a hope that no one would consider this impolite and/or harmful, let me cite a lovely point of view on the underlying assumption, from “Correlation does not imply Causation”, that many contemporary Quantitative Finance modellers neglect or abstract from:

    For any two correlated events, A and B, the following relationships are possible:

    • A causes B; (direct causation)
    • B causes A; (reverse causation)
    • A and B are consequences of a common cause, but do not cause each other;
    • A causes B and B causes A (bidirectional or cyclic causation);
    • A causes C which causes B (indirect causation);
    • There is no connection between A and B; the correlation is a coincidence.

    Thus no conclusion can be made regarding the existence or the direction of a cause-and-effect relationship merely from the fact that A and B are correlated.
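    A minimal simulation of the common-cause case, with all numbers being illustrative assumptions: C drives both A and B, A and B never influence each other, yet they come out strongly correlated:

    ```python
    import random

    random.seed(0)

    def pearson(xs, ys):
        # plain Pearson correlation coefficient, stdlib only
        n = len(xs)
        mx, my = sum(xs) / n, sum(ys) / n
        cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
        vx = sum((x - mx) ** 2 for x in xs)
        vy = sum((y - my) ** 2 for y in ys)
        return cov / (vx * vy) ** 0.5

    # Common cause: C drives both A and B; A and B never touch each other.
    C = [random.gauss(0, 1) for _ in range(10_000)]
    A = [c + random.gauss(0, 0.3) for c in C]
    B = [c + random.gauss(0, 0.3) for c in C]

    print(round(pearson(A, B), 2))   # strongly correlated, zero direct causation
    ```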

    • 3 answers
    • 0 votes
  • You have opened a can of worms here, indeed, Marcos.

    As a good yardstick to compare with ( in a quantitative manner, thus getting a feeling in years, well, rather 10^XYZ years, for the realistic TimeDOMAIN costs of any such solution ),

    let me recommend following a fabulous lecture by Matthias Troyer ( ETH Zurich, [CH] ) on this very subject.

    • 1 answer
    • 0 votes