- cross-posted to:
- memes@lemmy.ml
Chrome is such a pile of shit.
compile my own custom android
I bought two double-rank 16gb sticks so I had the option of 64gb later, but with zero page file I have not had a single issue. I also hoard tabs when online shopping, but I use extensions to prevent them from loading until selected. Still, I’ve never even been over 25gb in daily use and gaming. I also have a 3090 though, so that 24gb of vram soaks up a lot.
I “cheaped out” with 32 and regretted it, working with huge files in RAM.
getting into 3d art is lowkey a regret, I was fine with my specs before, they even felt OP
I was into VR too, I was like damn this laptop’s a beast, and now I’m constantly struggling
128 here and I capped it the other day doing some in-memory parsing, lol
16 here, because 🍎
How much is that in banana, for scale?
Let’s see, it would take 3 Orangutans to make a Macbook. That’s 36 bananas
took me a few days but I fully switched to firefox. my computer finally runs the way it should.
Ecosia is another option but it is also Chromium based
Yeah we should stop using Chromium based browsers, because they favour Google indirectly.
My washing machine uses Firefox as well
i’m an angry non orgasmic bottom
Linux and FreeBSD systems? Happy and snappy.
Work Windows system filled with crap corp security software? Open electron apps and wait for them to load.
Personal Windows system? Master of Orion, the remake.
I do wonder how many people only hate windows because their IT installed crapware that takes half the CPU scanning every file move.
I watched a fascinating Rust video where a guy talked about all of the things different OSs do differently just in the rustup install process, and how one of them (I assume Windows, but I don’t recall) was way worse until it was fixed by changing how they did IO. I don’t work at that low a level so it’s not a thing for me, but it was interesting. (I tried to find it but failed.)
The file system Windows uses (NTFS) has a lot of neat features, but ends up being astronomically slow in unexpected ways for some file operations as a result.
I remember playing around with NTFS streams. They’re usually used to store random metadata about a file, the size of which doesn’t appear in the normal file size calculation/display in Windows. So you can have a 2kb text file with an alternate stream that has a zip of a band’s entire discography stuffed into it. Longest file transfer of 2kb ever. Another gotcha: the second you copy that file to a file system that doesn’t support alternate streams, they just vanish. So all of a sudden that long file transfer is super quick.
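If you want to poke at this yourself, here’s a minimal sketch (Windows/NTFS only; the filename and stream name are made up for illustration):

```python
import os

# An ordinary small file on an NTFS volume.
with open("notes.txt", "w") as f:
    f.write("tiny file")

# "filename:streamname" addresses an alternate data stream of the same file.
with open("notes.txt:hidden", "w") as f:
    f.write("x" * 10_000_000)  # ~10 MB stuffed into the stream

# Explorer and getsize() only report the main stream.
print(os.path.getsize("notes.txt"))  # 9 bytes; the 10 MB is invisible here
```

Copy notes.txt to a FAT32 stick and the hidden stream silently disappears, exactly as described above.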
what do you need 32gb for in a linux box?
Chrome. Or Firefox. They don’t use less ram on Linux. I can easily get Firefox to 10G.
Makes your penis bigger
ZFS and disk cache.
I love it for disk cache as I can then get slow drives. After the prefetch during boot (once every few months), things are just smooth.
either using it to serve a small network or the old video games
32gb of ram
yes, that was the amount I was responding to
Play some horribly unoptimized games, like the Oblivion Remaster that recommends having 32gb. Which is fucking insane.
Yeah, they defo need to work on optimization. It’s an unreasonably heavy game on both cpu and gpu, and it runs worse than cyberpunk rt overdrive while looking worse…
Oblivion needs a stable ~10GB with max settings on my system.
I wonder if they recommend 32 because 16gb is the minimum for maxed settings vanilla but they know everyone will have 2000 mods installed eventually and they are accounting for that. 🤔
The remake?
Ditch Google trash. Go for alternatives. E.g., Firefox instead of Chrome.
Firefox isn’t going to solve the issue of overly bloated websites.
can help
That seems like a fake website. Here is the real link: https://github.com/gorhill/uBlock
This website is maintained by Uros Gazvoda, founder of Futuristica, to help spread uBlock Origin - free, open-source ad blocker.
Not fake, just unaffiliated. The presented links and info seem correct to me.
ubo’s repo also warns about a different fake page, but not this one.
May as well chop the “s” off “alternatives”
“Two Firefox tabs at the same time, man”
That’s why I have 128gb of ram — productivity. I can even open a tab with youtube and oom won’t kill my family.
Are you sure that Firefox is a good alternative, given all the things they’ve been doing lately? And why not fork, by the way?
Also, Google is funding Mozilla, so… yeah.
Every other browser uses chrome and just puts their stuff on top. So you either use actual Google chrome, a rebrand (I know it’s not exactly that nor that simple), or you use Firefox.
there’s always librewolf
Hence the joke 🤣
I’d be in trouble, since between ZFS and my various VMs, my system idles at ~170 GB RAM used. With only 32 I’d have to shut basically everything down.
My previous system had 64 GB, and while it wasn’t great, I got by. Then one of the motherboard slots died and dropped me to 48 GB, which seriously hurt. That’s when I decided to rebuild and went to 256.
Oh yay, lemmy is finally popular enough to have a nobody asked e-peen guy!
NERD!
seriously, nice rig, phat stats
Real question. Doesn’t the computer actually slow down when you have that much memory? Doesn’t the CPU need to seek through a bigger space vs a smaller memory set?
Or is this an old school way of thinking?
That’s a complicated question. Bigger memory can be split between more banks, which can mean more precharge penalties if the memory you need to access is spread out between them.
But big memory systems generally use workstation or server processors, which means more memory channels, which means the system can access multiple regions of memory simultaneously. Mini-PCs and laptops generally only have one memory controller, higher end laptops and desktops usually have two, workstations often have 4, and big servers can have 8+. That’s huge for parallel workflows and virtualization.
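The channel point is just multiplication. A rough peak-bandwidth sketch, using DDR4-3200 and a 64-bit (8-byte) bus per channel as example numbers:

```python
def peak_bandwidth_gbs(mt_per_sec: float, channels: int, bus_bytes: int = 8) -> float:
    """Theoretical peak: transfers/sec x bytes per transfer x channel count."""
    return mt_per_sec * 1e6 * bus_bytes * channels / 1e9

print(peak_bandwidth_gbs(3200, 1))  # ~25.6 GB/s: mini-PC / laptop, single channel
print(peak_bandwidth_gbs(3200, 2))  # ~51.2 GB/s: typical desktop
print(peak_bandwidth_gbs(3200, 8))  # ~204.8 GB/s: big server
```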
No, that’s not how it works. Handling a larger address space (e.g., 32-bit vs 64-bit) could maybe affect speed between same-sized modules on a very old CPU, but I’m not sure that’s even the case by any noticeable margin.
The RA in RAM stands for random access; there is no seeking necessary.
Technically, at a very low level, size probably affects speed, but not to any degree you’d notice. RAM speed is actually positively correlated with size, but that’s more because newer memory modules are generally both bigger and faster.
> The RA in RAM stands for random access; there is no seeking necessary.
Well, there is: CPUs need to map virtual addresses to physical ones, and the more RAM you have, the more management of that memory you need to do (e.g. modern Intel and AMD CPUs can have up to 5 levels of indirection between a virtual and a physical address).
But it also caches those address mappings; as long as your TLB is happy, you’re happy. An alternative is to use larger page sizes (a page being the smallest amount of RAM you can address): the larger the page, the less you need to recurse into the page tables to actually find said page, but you can also end up wasting RAM if you’re not careful.
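A quick sketch of that wasted-RAM tradeoff (4 KiB and 2 MiB are the common x86 page sizes; the 5 KiB allocation is just an example):

```python
KiB, MiB = 1024, 1024**2

def wasted_bytes(alloc: int, page: int) -> int:
    """RAM lost to rounding the allocation up to whole pages."""
    return -(-alloc // page) * page - alloc  # ceiling division, then the overshoot

# A 5 KiB allocation backed by normal pages vs huge pages:
print(wasted_bytes(5 * KiB, 4 * KiB))  # 3072 bytes lost
print(wasted_bytes(5 * KiB, 2 * MiB))  # 2092032 bytes (~2 MiB) lost
```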
You clearly know more than me, but wouldn’t everything from 4GB to 1TB have the same number of walks? And one more walk gets you up to 256TB?
So one of the problems is the size of a “physical page”, on a stock x86 system that’s only 4KiB. If you allocate just 1MiB of RAM you need to back that with 256 “page table entries”, and to then load a virtual address within that allocation you need to walk that list of 256 entries to find the physical address in RAM that the CPU needs to request.
Of course these days an app is more likely to use 1 GiB of RAM, that’s a mere 262,144 page table entries to scan through, on each memory load.
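The entry counts, worked out (4 KiB pages; the 8-bytes-per-entry figure is the x86-64 PTE size, added here just for scale):

```python
PAGE = 4 * 1024  # stock x86 page size
PTE = 8          # bytes per page table entry on x86-64

def pte_count(alloc_bytes: int) -> int:
    """One page table entry per 4 KiB page, rounded up."""
    return -(-alloc_bytes // PAGE)

print(pte_count(1 * 1024**2))        # 1 MiB -> 256 entries
print(pte_count(1 * 1024**3))        # 1 GiB -> 262144 entries
print(pte_count(1 * 1024**3) * PTE)  # 2097152 bytes: ~2 MiB of bookkeeping per GiB mapped
```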
Oh, but then we’re also not running a single process; there are multiple processes on the system, so there will be several million of these entries, each one indexed by address (which can be duplicated, since each process has its own private view of the address space) and then by process ID to disambiguate which entry belongs to which process.
That’s where the TLB comes in handy, to avoid the million or so indexing operations on each and every memory load.
But caching alone can’t solve everything, you need a smarter way to perform bookkeeping than simply using a flat list for when you don’t have a cached result. So the OS breaks down those mappings into smaller chunks and then provides a table that maps address ranges to those chunks. An OS might cap a list of PTEs at 4096 and have another table index that, so to resolve an address the CPU checks which block of PTEs to load from the first table and then only has to scan the list it points to.
Like this: this is the 2-level scheme that Intel CPUs used before the Pentium Pro (iirc). The top 10 bits of an address selected an entry in the “page directory”; the CPU loads that and uses the next 10 bits to select the group of PTEs from that list. Following that link, it finds the actual PTEs that describe the mappings, and then it can scan that list to find the specific matching entry that describes the physical address to load (and it then promptly caches the result to avoid doing all that again).
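In code, that 10/10/12 split looks something like this (a sketch of the classic 32-bit two-level scheme, not any particular OS’s implementation):

```python
def split_vaddr(vaddr: int) -> tuple[int, int, int]:
    """Split a 32-bit virtual address into the classic two-level
    x86 paging indices: page directory, page table, byte offset."""
    offset = vaddr & 0xFFF             # bits 0-11: byte within the 4 KiB page
    table_idx = (vaddr >> 12) & 0x3FF  # bits 12-21: which PTE in the page table
    dir_idx = (vaddr >> 22) & 0x3FF    # bits 22-31: which page table to walk
    return dir_idx, table_idx, offset

print(split_vaddr(0xDEADBEEF))  # (890, 731, 3823)
```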
So yes, for a given page size and CPU you have a fixed number of walks regardless of where the address lives in memory, but we also have more memory now. And much like a hoarder, the more space we have to store things, the more things we store, and the more disorganised it gets. Even if you do clear a spot, the next thing you want to store might not fit there, and you end up storing it someplace else. If you end up bouncing around looking for things, you thrash the TLB, throwing out cached entries you still need, so you have to perform the entire table walk again (just to invariably throw that result away soon after).
Basically, you need to defrag your RAM periodically so that the mappings don’t get too complex and slow things down (the same is true for SSDs, btw: you still need to defrag them occasionally to clean up the filesystem metadata itself, just far less often than HDDs). Meta have been working on improvements to how Linux handles all this (page table layout and memory compaction) for a while, because they were seeing some of their long-lived servers waste about 20% of CPU time on repetitive walks due to a highly fragmented address space.
Well, I don’t think I need that much RAM, but it’s a funny joke, modern browsers consume an insane amount of RAM.
not all of them, but def most. in my experience Firefox, Tor, and Librewolf have been pretty good in that regard
That’s the same engine for all three (Firefox’s), so you should get fairly similar results
Is it Chrome or is it the web page
Maybe it’s Maybelline
My mid-range gaming PC from 2019 had 16gb, and I was looking at some new pre-builts and saw many still only have 16. Is there just not much need for more, or what? It’s cheap - I might double what I’ve got in DDR4 for $50.
If you’re doing a new PC then I’d aim for 32GB.
16GB is enough, yes, but for how much longer? It’s been the norm for a while now, which means that soon it won’t be enough.
I would say that 16gb is barely enough if you’re planning on gaming. UE5 games can easily fill that up, so if you want game recording or a browser in the background, then above 16 is mandatory. Maybe at least 24gb.
sort of what I was thinking. I only hit the limit when I have way too many tabs open while playing an intensive game, but it’s a cheap upgrade that might keep me in this PC for a few more years.
I have 32gb and I always suggest others get that much as a baseline these days. I rarely ever use anywhere close to the full 32gb, but I am often at or near 16gb in use. The main benefit of having 32gb in my case is that I’ll basically never hit the pagefile; with only 16gb you’ll probably rarely max out your ram, but you’ll be hitting the pagefile more often.
With the proliferation of fast SSDs and NVMe drives, hitting the pagefile is considerably less impactful than it used to be with spinning disks, but it’s still slower than RAM.
I upgraded from 16 to 64 a few months ago, and kinda regret not going for 96 instead. Hoping these 64 last around a decade like the 16GB did previously
For gaming it is unlikely you will need more than 16, at least not any time soon.
Depends on the game and what you’re doing with it. Cities: Skylines with a bunch of mods really struggles without a load of RAM. Playing Vintage Story recently, I installed a bunch of mods. Had to uninstall about half to come in under 32GB utilization.
I have a VR headset. Going from 16GB to 64GB was a huge difference in most games
I’ve just added 64GB to the 16 that were fitted, because beam.ng would crash loading Utah with mods. This was less time-consuming than finding the probably misbehaving mod or other root cause. The mainboard is from an old ThinkStation, so the RDIMMs only set me back 40EUR. Nice experience.
Mahalo, friend.
16GB is perfectly enough currently and I expect it to stay so for the next 5+ years.
False.
16gb is maybe enough for a phone these days.
32gb is the bare minimum baseline, and if you want to game AND use a browser you should be seriously moving to 64gb
And if you’re a power user of ANY kind, go straight to 128.
The only people who need 128 or more do not need to ask.
So, if you only browse the web with a few other programs, and you have less than 20 tabs in a browser, 32gb.
If you go over 20 tabs and want to game at the same time, just go to 64gb.
I would say 16 GB is the bare minimum. Oblivion, for example, needs about 10 GB. If you have Discord, your browser, and 1-2 other programs running in the background simultaneously, you will easily reach your limit.
And that’s probably still relying HEAVILY on a pagefile.
Not really. 16 gigs is like the base amount of VRAM on the new 5xxx series nvidia GPUs, and you probably want more RAM than VRAM in your rig…
With 16 GB of ram I can perfectly well virtualize W11, giving 8 GB of ram to the guest (on a Linux host), so yes, for normal use 16 GB is perfectly fine.
Cool, danke.
Tbh I can already do that with 8gb
Yeah but try 3 mfer
I’m gonna download more RAM right meow! https://downloadmoreram.com/
Upgrade to 64GB cause 32 is not enough for my adhd
Then upgrade from 64gb to 128gb because it’s still not enough for my adhd
I’ve actually been considering using 128gb recently. I’m only considering this as I’m thinking about turning a server of mine into my primary desktop and it has 128gb in it already because I was using a RAM disk to generate large files in memory. I’m now done with that project and it feels silly having this powerful PC sitting here doing nothing.
A single vote cannot convey my interest in this idea.