Yes. Please. Although something strongly typed would be even better. It’s ridiculous the world runs on a language built in 2 weeks.
It’s also ridiculous to think it’s still the same language that was built in two weeks, like absolutely no work was done in it over time.
I’ll admit I don’t really know the history of the language between then and now. Please don’t tell me the crazy stuff was somehow added later.
If you don’t know the history, why are you so confidently talking about JavaScript being built in 2 weeks?
I’m familiar with the early history, I’ve dabbled in it in modern times, and of course I’ve seen all the ways it gets memed on as bad, ad infinitum (and I have to agree).
I didn’t really think I had to be able to write a book on it to say it doesn’t deserve the use it gets, and I don’t think it’s outrageous to imagine that there’s a connection with the hasty genesis. So I mentioned that off-hand. If it’s actually unconnected, my bad.
The memes are annoying because most of the complaints are superficial.
“Look, "b" + "a" + +"a" + "a" outputs "baNaNa"! JavaScript bad!” Yeah, that’s what happens. Just don’t do that thing that obviously doesn’t look right. Don’t use var, just like every modern JavaScript learning resource will tell you. Don’t use == if you don’t intend for type coercion to happen.
If you don’t write bad code on purpose, JavaScript is fine.
If coding teaches you anything, it’s that there’s no bound to the different ways you can screw up. Don’t use bad languages on purpose.
It’s gotten better. Still some weird stuff but it has cool stuff too now.
Python is strongly typed, but it is also dynamically typed.
TIL. Obviously I’ve avoided using it much.
So how does that work? Is there a few implicit conversions that are allowed, but if you really write something weird it will complain?
Yes, it has none of the implicit conversions that JS or R have. It does, however, allow you to leave a variable’s type unspecified and even change it without complaining. Even if you add type annotations, these are only hints that won’t generate errors unless you use an external type checker (e.g. mypy).
example:
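(A minimal sketch of the sort of snippet meant here; xs and i are just illustrative names:)

    xs = ["a", "b", "c", "d"]
    i = 6 / 2     # plain / always returns a float in Python 3, so i is 3.0
    xs[i]         # TypeError: list indices must be integers or slices, not float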
This throws an error because i is a float and the list index expects an integer, so for it to work the code needs to look like this:
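(Again just a sketch, with the same illustrative names:)

    xs = ["a", "b", "c", "d"]
    i = 6 // 2    # floor division of two ints gives an int
    xs[i]         # fine: xs[3] is "d"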
meanwhile this works:
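(Presumably something along these lines, showing a variable changing type at runtime without complaint:)

    xs = ["a", "b", "c", "d"]
    i = 1.5       # i starts out as a float
    i = 3         # ...and is silently rebound to an int
    xs[i]         # works: xs[3] is "d"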
Isn’t // integer division?
It is, but if you start with a float you get a float back.
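A quick illustration of that:

    print(6 // 2)      # 3   -> int // int gives an int
    print(6.0 // 2)    # 3.0 -> a float operand gives a float back, still unusable as an index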
You’re right, I did not know that. Thanks!
Was really surprised by this too, because iirc Python 2 did not do this.
You can do i: int to make this error out.
No, type hints are not enforced.
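(A small illustration; an external checker like mypy would flag the reassignment, but the interpreter itself never complains:)

    i: int = 3        # the annotation claims i is an int...
    i = 2.5           # ...but CPython happily rebinds it to a float
    print(type(i))    # <class 'float'> -- no error at runtime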
damn
In python you always have the right type, cause everything is an object
And yet somehow it evolved to become something that will last to the heat death of the universe.
I’ve grown used to it with time, though. Once you know it’s “quirks”, it’s not so bad.
I guess the internet just grew that fast. The first arrival took it all and locked everybody in.
Now we have just two browsers that are widely used, so maybe we do have an opportunity to go back and fix it. Go sounds like a pretty popular choice for a statically typed, imperative, high-level language.
Honestly, given the context of a browser, Javascript’s “Everything is better than crashing” philosophy does not seem too out-of-place. Yes, the website might break, but at least it would be theoretically usable still.
Yes, a statically typed language would help, but I’d rather not have one that is “these two types are slightly different, fuck you, have a segfault”, but rather one that is slightly more flexible.
Actually, that’s a good point: in scripting, fatal type errors can happen at runtime. I guess Python is the right choice then, given its maturity and popularity, and then you can code the complex stuff in whatever you want via WASM, like other people mentioned.
Not even “not so bad”, I would say that as a scripting language it’s fantastic. If I’m writing any actually complex code, then static typing is much easier to work with, but if you want to hack together some stuff, python is great.
It also interfaces extremely easily with C++ through pybind11, so for most of what I do, I end up writing the major code base in C++ and a lightweight wrapper in Python, because then you don’t have to think about anything when using the lib, just hack away in dynamically typed Python, and let your compiled C++ do the heavy lifting.
Python is actually mostly strongly typed: strongly (e.g. you can’t use a number as a string without explicitly converting it), but dynamically (a variable can change type at runtime). You probably would prefer a statically typed language, and I agree.
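(A quick illustration of the strong-but-dynamic combination:)

    n = 3
    print("value: " + str(n))   # fine once the conversion is explicit
    n = "three"                 # the same name can be rebound to a str at runtime (dynamic)
    print("value: " + n)        # fine now, since n is a str
    print("value: " + 3)        # TypeError: can only concatenate str (not "int") to str (strong)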
Alright, thanks for the help with terminology. I’m a bit confused about changing types at runtime. I thought a compiled or interpreted language stopped having types at runtime, because at that point it’s all in assembly. (In this case of course it’s scripting, which someone pointed out to me elsewhere)
That’s a compiled language; an interpreted language is translated to machine code at runtime, in Python’s case pretty much one line at a time.
Disclaimer: To the best of my knowledge, please correct me where I’m wrong.
That’s really only native compiled languages. Many popular languages, such as C# and Java, lie somewhere in between: they get compiled to intermediary bytecode and only go native as the very final step when running. They run in a runtime environment that handles that final step to execute the code natively. For .NET languages that’s the CLR (Common Language Runtime).
For .NET the process goes like this:
You write the code.
The code is compiled to MSIL.
At runtime, a JIT (just-in-time) compiler translates the MSIL into native code.
The native code is executed.
Java has a similar process that runs on the JVM. This includes many, many languages that run on the JVM.
JavaScript in the browser goes through a similar process these days, without the intermediary bytecode. Correction: JS in modern browsers also follows this process almost exactly. A JIT compiler compiles to bytecode, which is then executed by the browser’s JS engine. Historically JS was entirely interpreted, but that’s no longer the case. Pure interpreted languages are pretty few and far between. Most languages we think of as interpreted are actually compiled, just transparently as far as the dev is concerned.
Last, but certainly not least, Python is also a compiled language; it’s just usually transparent to the developer. When you execute a Python program, the Python compiler also produces an intermediary bytecode that is then executed by the Python runtime.
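(An easy way to peek at that bytecode step in CPython, using the standard dis module:)

    import dis

    def add(a, b):
        return a + b

    # Prints the bytecode CPython compiled the function into; the interpreter
    # executes these instructions rather than the source text directly.
    dis.dis(add)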
All that being said, I welcome any corrections or clarifications to what I’ve written.
I did know the difference, but I didn’t realise it ran one line at a time! I had kind of assumed it at least did one pass through everything before giving output. Thanks.
I believe it does “one pass” when it loads the code into ram, because syntax errors can be caught before anything runs. But I think the actual interpretation happens pretty much one line at a time :)
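(One way to see both halves of that, sketched with the built-in compile() and exec():)

    # The whole chunk is compiled up front: a syntax error on the second line
    # stops the first line from ever running.
    try:
        compile("print('hi')\nif True\n", "<demo>", "exec")
    except SyntaxError as err:
        print("caught before anything ran:", err.msg)

    # A missing name, by contrast, only blows up once execution reaches it.
    code = compile("print('this line runs first')\nundefined_name\n", "<demo>", "exec")
    try:
        exec(code)
    except NameError as err:
        print("caught at runtime:", err)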
Typescript!
Couldn’t agree more.
Java Applets!