I am curious why Hacker News is always so happy and obsessed with Lisp; there is a Lisp post on the front page almost daily. I don't really care/mind, but I am just curious why people love Lisp so much. The only things I notice about Lisp are the bracket jokes and memes.
Paul Graham and Robert Morris built one of the first web applications (that's a PG claim, btw) that allowed users to build their own, separate websites that had shopping carts, custom UI, etc.
They wrote the app in Common Lisp and claimed that it was their secret weapon which allowed them to iterate faster and stay ahead of their competition. They ended up selling their company (ViaWeb) to Yahoo for $50 million (IIRC).
PG also authored two books on Lisp, and HN is written in Arc (a Lisp built on top of Racket).
Around 2005 I was doing a lot of Lisp programming and I was really interested in what PG had to say. He was writing a lot of essays then and giving lots of talks, etc.
Unfortunately, I don't get the opportunity to use Lisp as much as I used to, but I'm doing a lot of Python and even played around with Hy (Lisp on top of Python). So I'm interested in Lisp as I find it to be an incredibly powerful language that is really fun to use.
After being a user of languages for a long time, writing my own LISP was the experience of leaving Plato's cave and seeing what the thing I use daily is really made of. What is "if"? What is a boolean, a variable, a function?
It's a beautiful thing. I wish I never had to climb back down into the cave (languages with more advanced parsers) again.
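For anyone curious what that looks like, here is a minimal sketch (not the parent's implementation, just a generic toy in Common Lisp with made-up names) of the heart of such an evaluator. Once you write something like this yourself, "if" stops being magic: it's just "evaluate the test, then evaluate one of the two branches", and a variable is just a lookup in an environment.

    ;; Toy evaluator sketch: a variable is an alist lookup, IF is a choice of
    ;; which sub-form to evaluate, everything else is an ordinary call.
    (defun toy-eval (form env)
      (cond ((symbolp form) (cdr (assoc form env)))      ; variable reference
            ((atom form) form)                           ; self-evaluating literal
            ((eq (first form) 'if)                       ; (if test then else)
             (if (toy-eval (second form) env)
                 (toy-eval (third form) env)
                 (toy-eval (fourth form) env)))
            (t (apply (symbol-function (first form))     ; ordinary function call
                      (mapcar (lambda (x) (toy-eval x env))
                              (rest form))))))

    ;; (toy-eval '(if (< x 10) 1 2) '((x . 3)))  => 1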
Interestingly enough, the primitive versions of LISP that aren't high-performance (i.e., no advanced compilers) could probably be done by hand in hardware, with the whole thing bootstrapped up with LISP-only tech. The LISP would start/stop at the abstract state machines or RTL of the bootstrapped version.
EDIT: Good luck on making it through the hurricane. Feel for yall out there.
Learning assembly is great, and every programmer should dive into at least a simpler version of it, but it's not exactly the best way to learn how your high-level language is implementing tail-call optimization.
Which you can code in a convenient way by using Lisp and Lisp macros to generate the <insert target CPU> machine language output using <insert target CPU> opcodes.
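A hedged sketch of the idea (target and names made up; x86-32 MOV EAX, imm32 as the single example opcode). The macro computes the opcode bytes at expansion time, so the "assembler" is just ordinary Lisp list manipulation:

    ;; Hypothetical assembler macro: expands MOV EAX, imm32 into its raw bytes.
    ;; Assumes IMM32 is a literal constant known at macro-expansion time.
    (defmacro emit-mov-eax (imm32)
      `(list #xB8                      ; opcode for MOV EAX, imm32
             ,(ldb (byte 8 0) imm32)   ; immediate, little-endian byte order
             ,(ldb (byte 8 8) imm32)
             ,(ldb (byte 8 16) imm32)
             ,(ldb (byte 8 24) imm32)))

    ;; (emit-mov-eax #x12345678) => (184 120 86 52 18), i.e. B8 78 56 34 12

A real Lisp assembler would of course cover the whole instruction set and handle labels, but the mechanism is the same.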
> Every time someone says this there seem to be nothing but haters.
Because it's not completely true. If a CPU is microcoded, then it's accurate to say "assembly is interpreted" because every instruction is effectively an address into a lookup table of microinstructions. But in a non-microcoded (e.g. purely RISC) CPU, the bits of the instruction word are effectively enable and data lines to a bunch of hardware logic gates and flip-flops, which cause register transfers and arithmetic operations to happen. In this case, the ones and zeros in the instruction word are voltage levels on logic gates. Calling the latter "interpretation" is a stretch.
To be fair, there aren't many pure RISC implementations around these days. Most everything has some degree of microcode involved, so to that extent you're right.
It's interpreted because the instructions are fetched one by one. A piano roll is interpreted, even though its holes just activate keys with a "horizontal encoding". It is interpreted because it moves through the piano, and a little piece of it activates a behavior at any one time, without leaving a permanent record.
Not only is machine code interpreted, the so-called "asynchronous interrupts" are just periodically polled for in between the fetching, decoding and executing.
I'll use x86-32 for elaboration [1]. When the CPU sees the byte sequence 0xB8 0x90 0x41 0x5A 0x7B, it has to interpret what those bytes mean. It sees the 0xB8, so then it knows that you are loading the EAX register with an immediate value. The next four bytes (0x90, 0x41, 0x5A, 0x7B) are read and stored into EAX (as 0x7B5A4190, because Intel is little-endian).
That is the case for all instructions. Each one is interpreted by the CPU. And on modern CPUs, each is even translated into an internal format that is further interpreted.
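A hedged sketch of that fetch/decode/execute loop (toy code, only the one opcode handled), just to make "the CPU interprets the bytes" concrete:

    ;; Toy interpreter for the byte sequence described above: fetch an opcode,
    ;; decode it, perform the register transfer, advance the program counter.
    (defun run (code)
      (let ((eax 0) (pc 0))
        (loop while (< pc (length code))
              do (case (aref code pc)
                   (#xB8                       ; MOV EAX, imm32
                    (setf eax (logior (aref code (+ pc 1))
                                      (ash (aref code (+ pc 2)) 8)
                                      (ash (aref code (+ pc 3)) 16)
                                      (ash (aref code (+ pc 4)) 24)))
                    (incf pc 5))
                   (t (error "unknown opcode ~X" (aref code pc)))))
        eax))

    ;; (run #(#xB8 #x90 #x41 #x5A #x7B)) => 2069512592, i.e. #x7B5A4190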
[1] Sans power right now. Using my cellphone as a hot-spot and my iPad as a laptop. The aftermath of hurricanes is brutal in Florida.
Gotcha, but then (unless I'm misunderstanding) one of these interpreters is not like the other. Namely, assembly 'interpretation' happens on bare metal. Were you previously suggesting that understanding a lisp interpreter will help in understanding CPU architecture?
(Good luck recovering from the hurricane! Keep your head down!)
I was replying more to jospar's post about learning what an 'if' was, what a 'boolean' was, etc. What's an 'if'? Ultimately it's a comparison of two numbers and a transfer of control based upon said comparison. On some architectures that's one instruction, on others it's two.
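You can see that from the Lisp side too. A hedged example (output omitted, since it's implementation- and platform-specific): on SBCL, disassembling a declared fixnum comparison typically shows exactly that pattern, a compare followed by a conditional jump.

    ;; Ask the compiler what IF becomes at the machine level. On SBCL the
    ;; disassembly of this usually contains a compare and a conditional branch
    ;; (exact instructions vary by platform and version).
    (disassemble
      (lambda (a b)
        (declare (type fixnum a b)
                 (optimize (speed 3) (safety 0)))
        (if (< a b) 1 2)))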
Lisps have properties that are not so common in other languages:
- Homoiconicity: Code and data are the same (s-expressions); see the sketch after this list
- Lisp code is basically the AST in itself
- It's trivial to implement lisp in lisp (eval)
- Continuations (call/cc)
- Macros
- etc...
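A small sketch of the first three points in plain Common Lisp (made-up macro name, nothing library-specific):

    ;; Homoiconicity: code is an ordinary list you can build, inspect, and run.
    (defvar *form* (list '+ 1 2 3))   ; build "code" as plain data
    (eval *form*)                     ; => 6, run the data as code

    ;; Macros receive that same list representation and return a new list,
    ;; which is what actually gets compiled. A made-up UNLESS to show it:
    (defmacro my-unless (test &body body)
      `(if ,test nil (progn ,@body)))

    (my-unless (> 1 2)
      (print "1 is not greater than 2"))   ; prints, since the test is false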
It's a truly fascinating language.
I never really did any Lisp coding outside some university projects and I still obsess and read about Scheme all the time.
>I am just curious why people love Lisp so much.
Lisp is addictive, like a hard drug.
Lisp is the original "programmable programming language".
Experienced programmers with no previous knowledge of Lisp should relish the features provided (or doubt them in disbelief).
Lisp, at least Common Lisp (but other Lisp dialects as well), might claim to be the most powerful and versatile programming language (and environment) available.
CL can run pretty fast: roughly the same speed as Java on the Oracle JVM, and with some tricks applied it can approach C and Fortran speed under certain conditions. CL code is also extremely portable.
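"Some tricks" usually means optimization settings plus type declarations. A minimal sketch (not from any particular benchmark), which lets a compiler like SBCL emit unboxed double-float arithmetic:

    ;; Declare element types and crank OPTIMIZE so the compiler can use
    ;; unboxed double-float arithmetic and skip runtime checks.
    (defun sum-of-squares (xs)
      (declare (type (simple-array double-float (*)) xs)
               (optimize (speed 3) (safety 0)))
      (let ((acc 0d0))
        (declare (type double-float acc))
        (dotimes (i (length xs) acc)
          (incf acc (* (aref xs i) (aref xs i))))))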
Lisp has been used for very high level tasks (AI, etc) and for low level tasks (operating systems for Lisp machines).
Lisp is one of those languages that, once I learned it (Clojure in my case), I started wondering why every language wasn't more like it. Once you get used to the parentheses and whatnot, it's actually an incredibly beautiful language that gives you the ability to augment the language without waiting for updates via its elegant macro system.