How do interpreters work?

Discussion in 'Computer Programming, Emulation, and Game Modding' started by EthanAddict, Dec 15, 2016.

  1. EthanAddict
    OP

    EthanAddict Founder of Skiddon't-ism

    Member
    433
    1,927
    Nov 12, 2016
    Greece
    I am currently studying computers as a hobby and I haven't understood the steps an interpreter follows. I understand the basic idea, but the more advanced stuff confuses me a bit.
     
  2. FAST6191

    FAST6191 Techromancer

    pip Reporter
    23,361
    9,153
    Nov 21, 2005
    When you say more advanced stuff, are you talking about things like just-in-time compilation vs the almost state-machine operation of the basic thing, or very in-depth specifics somewhere?

    Anyway this is going to involve a trip into the lowest levels of computing, which is amusing given interpreters are tools designed to avoid having to think about all that.
    Computers have CPUs which run mostly fixed sets of instructions, or fixed for the intents and purposes of basically everybody (microcode and FPGAs change this but let us not go there).
    Even on complex processors these instructions are quite short and sweet; older, less complex stuff is even more basic.
    If you have studied logic gates you will have been shown how to make any other logic gate from a collection of NAND gates. http://www.electrical4u.com/universal-gate-nand-nor-gate-as-universal-gate/
    You would probably also have been taught how to make a basic adding machine http://www.electronics-tutorials.ws/combination/comb_7.html
    To finish up I guess we will have a shift register as well http://www.electronics-tutorials.ws/sequential/seq_5.html and a flip flop because why not https://www.facstaff.bucknell.edu/mastascu/eLessonsHTML/Logic/Logic4.html
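
    To make the NAND-universality idea concrete, here is a little sketch in Python (gate and function names are just my own picks, not from the links above): every basic gate, and then a half adder, built from nothing but NAND.

    ```python
    # Every gate below is built purely from NAND, mirroring the
    # universal-gate idea: NAND alone is enough for all of logic.
    def NAND(a, b):
        return 0 if (a and b) else 1

    def NOT(a):
        return NAND(a, a)           # NAND with itself inverts

    def AND(a, b):
        return NOT(NAND(a, b))      # invert the NAND back

    def OR(a, b):
        return NAND(NOT(a), NOT(b)) # De Morgan: a OR b == NOT(NOT a AND NOT b)

    def XOR(a, b):
        return OR(AND(a, NOT(b)), AND(NOT(a), b))

    def half_adder(a, b):
        # The basic adding-machine block: sum bit is XOR, carry bit is AND.
        return XOR(a, b), AND(a, b)
    ```

    Chain half adders (plus a carry input) and you get the full adders those tutorials build whole arithmetic units from.
    
    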

    Everything is adding. Adding is adding, multiplication is adding (2x4 == 2+2+2+2), subtraction is just adding from a different viewpoint, division is harder and tends to want logarithms (log(a/n) == log(a) - log(n)) but is adding once you have that. The shift register and flip flop are memory at its most basic form. Logic gates themselves are inherently the basis of loops -- you quite literally describe an AND gate as IF both inputs are high THEN the output is high ELSE the output is low.
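
    Spelled out in code (just a toy sketch to show the reduction, nothing more):

    ```python
    def multiply(a, n):
        # 2 x 4 == 2 + 2 + 2 + 2: multiplication reduced to repeated addition.
        total = 0
        for _ in range(n):
            total = total + a
        return total

    def subtract(a, b):
        # Subtraction as "adding from a different viewpoint": add the negation.
        return a + (-b)
    ```
    
    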

    From these basic blocks though you can build up quite complex things to calculate and do operations. https://stuff.pypt.lt/ggt80x86a/asm3.htm#first is not the one I was thinking of but gets you there.

    Interpreters then have lots of prebaked building blocks, though far more complex than conventional CPU instructions, that they can put together to run the code being interpreted. Compared to bare metal CPU stuff it is slow as sin, and likely quite restricted as well, but in the end any successful interpreter still does enough things* in short enough periods that humans are OK with it. The bonus of being restrictive is that you can dodge some of the security concerns as well, since the interpreter can vet things beforehand, or dynamically recalculate how large an array needs to be, or notice that you are trying to use two objects of different types together.
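
    The core of that is simpler than it sounds. A toy sketch (instruction names and the stack-machine design are my own choices for illustration): a table of prebaked operations and a loop that walks the program, dispatching each instruction to its building block.

    ```python
    def run(program):
        # A tiny stack-machine interpreter: each instruction name maps
        # to a prebaked building block; the loop just dispatches.
        stack = []
        ops = {
            "PUSH": lambda arg: stack.append(arg),
            "ADD":  lambda arg: stack.append(stack.pop() + stack.pop()),
            "MUL":  lambda arg: stack.append(stack.pop() * stack.pop()),
        }
        for name, arg in program:
            ops[name](arg)
        return stack

    # (2 + 3) * 4
    result = run([("PUSH", 2), ("PUSH", 3), ("ADD", None),
                  ("PUSH", 4), ("MUL", None)])
    print(result)  # [20]
    ```

    A real interpreter adds parsing, variables, control flow and so on, but the dispatch loop above is the "almost state machine" heart of it, and JIT compilation is essentially replacing that loop with freshly generated native code.
    
    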

    *if you ever go look at an intel CPU manual you will see the hundreds of instructions that a modern CPU has; almost nobody knows what every one does down to the level of its little quirks, give or take being able to figure it out from the name. Most programs boil down to a core handful of instructions and the rest are nice when you have to manipulate that data type -- if you have a massive array and have to do everything one by one it takes an age, but SSE instructions come along and you can add a massive array to a massive array all at once in one instruction that takes a fraction of the time of simple adding. My grocery list program probably does not care but my video decoder does. Equally, if you go back to the assembly tutorial I linked you will see all sorts of nonsense has to be done at the start of a program to set things up, and that troubles a lot of people that just want to get in and do. This is why we have higher and higher level languages. Beyond that, if you have prebaked building blocks which the main program calls, then you can port all those building blocks to do the same thing on another OS or CPU and have a script written for one hopefully work just fine on another.
     
  3. EthanAddict
    OP

    EthanAddict Founder of Skiddon't-ism

    Member
    433
    1,927
    Nov 12, 2016
    Greece
    If I had the money and time, I would make a processor from AND, OR and NOT gates. Also, multiplication works by LSHIFT and RSHIFT. Not all instructions are adding...
     
  4. FAST6191

    FAST6191 Techromancer

    pip Reporter
    23,361
    9,153
    Nov 21, 2005
    Logic gates are cheap and breadboards are not expensive either. Equally, there are many logic gate simulators:
    https://academo.org/demos/logic-gate-simulator/

    Also, though shifting is a nice quick way to multiply and integer-divide by 2, 4, 8, 16, 32... it is still going to involve some addition and subtraction for many things. Equally, not everything will be solely shifting, and if your setup had a logarithm lookup at its core then it would still be adding.
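
    To show what I mean, here is binary long multiplication sketched in Python (function name is my own): the shifts pick out the powers of two, but the partial products still get combined by addition.

    ```python
    def shift_add_multiply(a, b):
        # Shift-and-add multiplication: shifting handles the powers of
        # two, but every set bit of b still contributes via an addition.
        result = 0
        while b:
            if b & 1:
                result += a  # the adding that shifting alone cannot avoid
            a <<= 1          # a * 2 each round
            b >>= 1          # move to the next bit of b
        return result

    print(shift_add_multiply(13, 11))  # 143
    ```

    Only when the multiplier is exactly a power of two does the addition disappear and a single shift suffice.
    
    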
     
  5. EthanAddict
    OP

    EthanAddict Founder of Skiddon't-ism

    Member
    433
    1,927
    Nov 12, 2016
    Greece
    I don't have time to design how everything will fit together or the instruction set, and I want to buy some other things for other projects, so... I still know how computers work (if not a big part of it).