- cross-posted to:
- memes@lemmy.ml
I’d like to do that for my projects, but I don’t use Windows, so I can’t package for it well. Sometimes when I try, the program runs on my computer but then doesn’t run on other computers because of missing DLLs or other things.
Anyone have a good idea how to make this easy? Installing a Windows VM and all that is such a hassle just for the tiny programs I make.
Make them in a portable language, Java for example. Or you can write them in Rust and compile for each target.
It’s in Rust. The problem is the GTK part: it has to be installed on the system for the program to run. But how do I distribute the program without making everyone install GTK on their computer? On Linux it’s just a dependency, so it’s not a problem; on Windows I can’t seem to make it work.
Edit: also, I need GTK because the people around me who use Windows aren’t going to use a CLI program at all.
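One approach that has worked for people shipping GTK apps to Windows (no guarantees, and the DLL names below are illustrative assumptions, not a real list — on an MSYS2 build you’d get the actual list by inspecting your exe, e.g. with ntldd) is to ship the GTK runtime DLLs in the same folder as the .exe, since Windows searches the exe’s own directory first. A rough Python sketch of that bundling step:

```python
import shutil
from pathlib import Path

# Illustrative names only -- get the real list by inspecting your
# built exe (e.g. with ntldd under MSYS2); these are assumptions.
GTK_DLLS = ["libgtk-3-0.dll", "libglib-2.0-0.dll", "libcairo-2.dll"]

def bundle(exe: Path, dll_dir: Path, dist: Path) -> Path:
    """Copy the exe plus its GTK runtime DLLs into one folder,
    ready to zip up and hand to Windows users."""
    dist.mkdir(parents=True, exist_ok=True)
    shutil.copy2(exe, dist / exe.name)
    for name in GTK_DLLS:
        src = dll_dir / name
        if src.exists():          # skip anything not present locally
            shutil.copy2(src, dist / name)
    return dist
```

Zip the resulting folder and the exe should find its DLLs without a system-wide GTK install, though icons/themes may need extra files.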
The 2024 animated movie Flow was done entirely in Blender. It is an incredible movie, highly recommend.
Blender was also used a bit in Everything Everywhere All At Once
Me running Godot on a new computer yesterday
Running FOSS on closed source systems. Classic.
It’s a start! Makes the switch much easier.
Agreed, OSS purity is silly. I am running an open source client (Thunder) for this open source service on my Pixel 9 running GrapheneOS, yet the low-level firmware is still absolutely proprietary.
Actually, to be clear, I don’t think FOSS purity is bad. I just mean that denigrating what others are doing because they’re using something non-free while they’re making steps in the right direction is dumb and counterproductive.
To my mind, FOSS is the only way forward for a healthy, functioning society, and the fact that so much of our digital landscape is being gradually replaced with it is to me evidence of that. I think the end goal should always be pure FOSS, but that doesn’t (necessarily) mean immediately jumping to all FOSS; it just means taking steps to cut out proprietary software wherever you reasonably can.
No. You either go full Stallman and inject Gentoo directly into your aorta, or you might as well be deep throating Satya Nutella while bouncing on Tim Apple’s lap. Filthy casual.
I just got Blender after having last looked at it ten years ago. It looks so much better! I had an easy time finding stuff. If you tried it in the past and were put off by how ugly it was, it’s worth another shot. Also look up the doughnut tutorial.
Oh it’s free so it lacks features
If you don’t count professional software, nowadays it’s actually the opposite. Very often in proprietary software, features get removed with no alternative provided by the developers, or there is one, but it has nothing to do with what you actually want.
On a somewhat related note, why do so many open source projects give me a zip file with a single exe inside it instead of just the exe directly?
Because zipping it can reduce the size
Plus a lot of antivirus whatevers will straight up block the downloading of *.exe
EXE files don’t really compress well, plus the files should already be internally compressed when the exe is built.
A lot of exe files are secretly zip files. zip files can contain arbitrary data at the end of the file. exe files can have arbitrary data at the start of the file. It’s a match made at Microsoft.
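You can see this trick from Python’s zipfile module, which locates the archive’s central directory from the end of the file, so prepended bytes (the fake exe stub below is just a stand-in) don’t break extraction:

```python
import io
import zipfile

# Build a small zip in memory
inner = io.BytesIO()
with zipfile.ZipFile(inner, "w") as zf:
    zf.writestr("hello.txt", "hi from inside the zip")

# Prepend arbitrary bytes, standing in for a real PE/exe header
stub = b"MZ" + b"\x00" * 62
combined = io.BytesIO(stub + inner.getvalue())

# zipfile searches for the end-of-central-directory record from the
# END of the file, so the prefixed data is simply ignored
with zipfile.ZipFile(combined) as zf:
    print(zf.read("hello.txt").decode())  # hi from inside the zip
```

This is the same property that makes self-extracting archives work: the exe part runs, the zip part extracts.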
If only it was that easy on Linux
One command line away?
If I have to go into DOS to do something a normal user wants to do, the GUI OS is a failure.
I right-click to extract my zips on Linux. Sure, I can also go into the CLI to do it, but you can have both.
What are you talking about? There’s no DOS in Linux, and I am not sure what the hell would that even mean.
The fact you called it DOS feels like you are just rage-bait trolling… lol
No, I call any command-line interface that runs from an internal drive “DOS”. I do mean the term somewhat generically as a Disk Operating System.
Then you should stop doing that. Even if you are running modern Windows, there’s no DOS in it to be seen, even though the command interpreter (cmd.exe) is very close to what was typically used in DOS (MS-DOS, PC-DOS, DR-DOS, etc.), COMMAND.COM. You are probably aware that the built-in commands there are actually very similar or the same as in MS-DOS. That’s because Microsoft didn’t want to make them different, probably mostly for compatibility reasons. There’s of course also PowerShell, and Bash (and other Linux shells) if you run WSL (Windows Subsystem for Linux). And these command interpreters are always (on NT Windows at least) opened in a terminal application, typically the older Console Host or the newer Windows Terminal.
But… it is easy. And so is the package manager. You don’t have to use the command line if you don’t want to, it is just another option.
You got downvoted by the Linux fanboys, but it’s not wrong. Linux has a big issue with approachability… And one of the biggest reasons is that average Windows users think you need to be some sort of 1337 hackerman to even boot it, because it still relies on the terminal.
For those who know it, it’s easier. But for those who don’t, it feels like needing to learn hieroglyphs just to boot your programs. If Linux truly wants to become the default OS, it needs to be approachable to the average user. And the average user doesn’t even know how to access their email if the Chrome desktop icon moves.
You don’t really need to use any command-line interface or commands if you are running a beginner-friendly Linux distro (Linux Mint, Zorin OS, etc.). Well, maybe except when things go very bad, but that’s very rare if you use your system like an average user.
I ran Linux Mint for close to a year and never used the terminal. It’s not 2000 anymore
I’d say {insert package manager} install blender is easier.

Yeah, but then I get an ancient version because I use Debian.
I think the last time I used Blender I installed it through Steam.
Time to install flatpaks. It’s the future of userspace programs on Linux anyway, and you’ll get the newest versions there the quickest.
That is part of the deal with Debian. You get stable software… but you only get stable software. If you want bleeding-edge software, you’ll have to install it manually to /usr/local, build it from source and hope that you have the dependencies, or containerize it with Distrobox.

If you go to a butcher, don’t complain about the lack of vegan options.
I’m aware and I’m not complaining. Just sharing what I thought was a funny story of using Steam as a package manager.
I don’t really like the way software installation is centralized on Linux. It feels like, Windows being the proprietary system, they don’t really care about how you get things to run. Linux, on the other hand, cares about it a lot. Either you write your own software or you interact with their ‘trusted sources’.
I would prefer if it was easier to simply run an executable file on my personal Linux machine.
The entire point is that you don’t need to wait through a slow installer; you just open Discover or a software center and install whatever software you need. In addition to being easier and more intuitive, it’s also more secure (you’re less likely to receive binaries from a malicious actor).
You can also just download any binary file you find online and run it. Or use any install.sh script you happen to find anywhere.

Package managers are simply a convenient offer to manage packages with their dynamically linked libraries and keep them up to date (important for security). But it’s still just an offer.
The difference between a package manager and an app store is that the package manager allows you to pick your own sources. You can even run your own repository if you wanted to.
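For example, on Debian-based systems adding a third-party source is just dropping a one-line file under /etc/apt/sources.list.d/ (the URL and keyring path here are made-up placeholders, not a real repository):

```
# /etc/apt/sources.list.d/example.list -- illustrative entry
deb [signed-by=/usr/share/keyrings/example-archive-keyring.gpg] https://example.com/debian stable main
```

After that, packages from your own or a third-party repo install and update through the exact same apt commands as everything else.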
You can still do that on Linux. Just download it and run. You can even compile it from source if that’s your thing.
However, because there is a much greater variety of Linux distros and dependencies compared to Windows or macOS versions, it’s better to either have a Flatpak, an AppImage, or a package from your distro’s repo. That way you can be fairly sure it will work without too much fiddling around.
Do you know about AppImages? Seems like those meet the need you’re complaining about.
You still have to set the executable flag for them, but you can do that through the graphical user interface. No need to open a terminal.
Software installation is anything but centralized on Linux. Sure, there is your store or package manager, but Apple and Microsoft have those too. And you can always add any source you want to that store (Flatpak is great), find an AppImage or some dubious install script, find your own packages and manually install them (like .deb), use Steam, or sometimes, like with Blender, just download, decompress, and run it.
As others have pointed out you can do this, but there are at least two major advantages to the way Linux distributions use package managers:
-
Shared libraries - on Windows, most binaries roll their own copies of code libraries into themselves, so every program that uses a given library installs its own copy on your hard drive. That is inefficient and wastes disk space, and it means that when a new version of the library is released you still have to wait for each program’s developer to ship a new binary that uses it. On Linux, applications installed via the package manager can share a single copy of common dependencies such as code libraries, and that library can be updated separately from the applications that use it.
-
Easy updating - on Windows you would have to download new versions of each program individually and install them when they are released. If you don’t do this regularly in today’s internet-dependent world, you expose your system to a lot of vulnerabilities. With a Linux package manager you can simply issue the update command (e.g. sudo apt upgrade) and the package manager will download and install all the new versions for you.
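The shared-library setup in point 1 is easy to poke at from Python: the ctypes module asks the platform’s normal linker machinery, so the one on-disk copy it finds is the same copy every dynamically linked program loads (the resolved name varies per system; the libm.so.6 fallback below is a glibc assumption):

```python
import ctypes
import ctypes.util

# Ask the system linker for the C math library; every dynamically
# linked program on the machine resolves this to the same file.
name = ctypes.util.find_library("m") or "libm.so.6"
print("resolved:", name)

# Load it and call a symbol, just like a compiled program would
libm = ctypes.CDLL(name)
libm.sqrt.restype = ctypes.c_double
libm.sqrt.argtypes = [ctypes.c_double]
print(libm.sqrt(9.0))  # 3.0
```

Update that one shared file and every program using it gets the fix, which is exactly the security argument for point 2.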
I can respect the value of point 1 - that’s nominally why we have .DLL files and the System32 folder, among other places. There are means to share libraries built into the OS, people just don’t bother for various reasons - as you said, version differences are a noted reason. It’s ‘inefficient’, but it hasn’t hurt the general user experience.
To point 2, the answer for me is simple: I don’t trust upgrades anymore - that’s not an OS-dependent problem, that’s an issue of programmers and UI developers chasing mindless trends instead of maintaining a functioning experience from the get-go. They change the UX, they require newer and more expensive computers for their utterly pointless flashy nonsense, and generally it leads to upgrades and updates just being a problem for me. In a setting like mine where my PC is actually personal, I’m quite happy to keep a specific set of programs that are known to be working, and then only consider budging after I’m sure it won’t break my workflow. I don’t want all the software to update at once; that’s an absolute nightmare scenario to me and will lead to immediate defenestration of the PC when any of the programs I use changes its UI again. I’m still actively raging at Firefox for going to the Australis garbage appearance, and I first moved to LibreOffice just because OpenOffice switched to a “ribbon”. I’ve had the same thing happen with other programs. I’m done with it.
Once I decide I’m going to continue using a program for a purpose, I don’t want some genius monkeying about with how I use it.
And as far as security, I can use AV software or a malware scanner that updates its database without breaking the user experience. I don’t need anyone else worrying about security except the piece(s) of software specifically built to mind it.
That’s literally what I wrote on a satirical post about moving to Windows! https://lemmy.world/comment/14612934 Except I was being sarcastic and you’re being serious.
Yup. We probably have different use-cases and different kinds of BS tolerance. Your satire is my truth.
It’s literally how Blender is distributed. Get the archive, extract it wherever, run blender.

“Your AppImage, Sir.”
You missed the /s
in blender’s case it literally is