Who would buy a Raspberry Pi for $120?

That is indeed a puzzling question, brought about by Raspberry Pi's introduction of the newest Raspberry Pi 5 model, with 16 GB of RAM.

Raspberry Pi 5 16GB RAM

I spent a couple of weeks testing the new Pi model, and found it does have its virtues. But at only $20 less than a complete GMKTec N100 Mini PC with 8 GB of RAM, it's probably a step too far for most people.

Pi 5 model B pricing from 2 to 16 GB

For most, the 2 GB ($50) or 4 GB ($60) Pi 5 is a much better option. Or if you're truly budget-conscious and want a well-supported SBC, the Pi 4 still exists and starts at $35. Or a Pi Zero 2 W for $15.

And for stats nerds, the pricing model for Pi 5 follows this polynomial curve almost perfectly:

Pi 5 model B pricing curve polynomial fit

...very much unlike Apple's memory and storage pricing for the M4 Mac mini, which follows an equation that ranges from "excellent deal" to "exorbitant overcharge".
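If you want to check the fit yourself, here's a minimal sketch using NumPy (the 8 GB model's $80 list price is assumed alongside the prices quoted above). Fit against raw gigabytes and the data collapses to a straight line, price = 5 * GB + 40; the curve shape only appears when you plot against the evenly spaced 2/4/8/16 axis from the chart:

```python
# Fit Raspberry Pi 5 list prices against RAM capacity.
# Prices: 2 GB = $50, 4 GB = $60, 8 GB = $80 (assumed list price), 16 GB = $120.
import numpy as np

gb = np.array([2, 4, 8, 16], dtype=float)
price = np.array([50, 60, 80, 120], dtype=float)

# Degree-1 fit against raw gigabytes: comes out exactly as price = 5 * GB + 40.
slope, intercept = np.polyfit(gb, price, 1)
print(f"price = {slope:.2f} * GB + {intercept:.2f}")

# Against an evenly spaced 2/4/8/16 axis (i.e. log2 of capacity), the same data
# needs a curve; a degree-2 polynomial gets within a couple of dollars,
# which is the "almost perfect" fit shown in the chart.
coeffs = np.polyfit(np.log2(gb), price, 2)
print("quadratic fit in log2(GB):", np.round(coeffs, 2))
```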

Performance

Before I get to the reasons why some people might consider spending $120 on a Pi 5, I ran a bunch of benchmarks, and one of the more pertinent results is High Performance Linpack (HPL):

High Performance Linpack - Pi 5 model B 8 GB vs 16 GB

This compares the performance of the 8 GB Pi 5 against the new 16 GB model. For many benchmarks, the biggest difference may be caused by the 16 GB model having the newer, trimmer D0 stepping of the BCM2712. But for some, having more RAM helps, too.

Applications like ZFS can cache more files with more RAM, leading to lower latency and higher bandwidth file copies—in certain conditions.
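If you're curious how much of that extra RAM ZFS is actually putting to work, you can read the ARC (ZFS's read cache) statistics directly. Here's a minimal sketch, assuming OpenZFS on Linux, which exposes them in /proc/spl/kstat/zfs/arcstats:

```python
# Print the current ZFS ARC (read cache) size, its ceiling, and hit ratio.
# Assumes OpenZFS on Linux, which exposes stats in /proc/spl/kstat/zfs/arcstats.

ARCSTATS = "/proc/spl/kstat/zfs/arcstats"

def read_arcstats(path=ARCSTATS):
    stats = {}
    with open(path) as f:
        for line in f:
            parts = line.split()
            # Data lines look like: "size  4  8239477312" (name, type, value).
            if len(parts) == 3 and parts[2].isdigit():
                stats[parts[0]] = int(parts[2])
    return stats

if __name__ == "__main__":
    arc = read_arcstats()
    gib = 1024 ** 3
    print(f"ARC size:  {arc['size'] / gib:.2f} GiB")
    print(f"ARC max:   {arc['c_max'] / gib:.2f} GiB")
    print(f"Hit ratio: {arc['hits'] / (arc['hits'] + arc['misses']):.1%}")
```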

For all my 16 GB Pi 5 benchmarks, see this follow-up comment on my Pi 5 sbc-reviews thread.

5 Reasons a 16 GB Pi 5 should exist

Raspberry Pi 5 eGPU setup playing Forza Horizon 4 Benchmark

But I distilled my thoughts into a list of 5 reasons the 16 GB Pi 5 ought to exist:

  1. Keeping up with the Joneses: Everyone seems to be settling on 16 GB of RAM as the new laptop/desktop baseline—even Apple, a company notoriously stingy with RAM in its products! So having a high-end SBC with the same amount of RAM as a low-end desktop makes sense, if for no other reason than just to have it available.
  2. LLMs and 'AI': Love it or hate it, Large Language Models love RAM. The more, the merrier. The 8 GB Pi 5 can only handle up to an 8-billion-parameter model, like llama3.1:8b. The 16 GB model can run much larger models, like llama2:13b. Whether getting 1-2 tokens/s on such a large model on a Pi 5 is useful is up to you to decide. I posted my Ollama benchmark results in this issue. (A quick way to measure tokens/s yourself is sketched after this list.)
  3. Performance: I already discussed this above, but along with the latest SDRAM tuning the Pi engineers worked on, this Pi is now the fastest and most efficient, especially owing to the newer D0 chip revision.
  4. Capacity and Consolidation: With more RAM, you can run more apps, or more threads. For example, a Pi 5 with 4 GB of RAM could run one Drupal or WordPress website comfortably. With 16 GB, you could conceivably run three or four websites with decent traffic, assuming you're not CPU-bound. You could also run more Docker containers or pimox VMs on the same Pi.
  5. AAA Gaming: This is, of course, a stretch... but there are some modern AAA games I had trouble with on my eGPU Pi 5 setup; they ran out of system memory on the 8 GB Pi 5, causing thrashing and lockups. For example, Forza Horizon 4 seemed to enjoy using about 8 GB of system RAM on its own (alongside the 1 GB or so required by the OS and Steam).
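On the LLM point (reason 2), here's a minimal sketch of how you could measure tokens per second yourself against a local Ollama install, using its HTTP API on the default port. The model name and prompt are just examples; pick whatever fits in your Pi's RAM:

```python
# Rough tokens-per-second check against a local Ollama server (default port 11434).
# Assumes Ollama is running and the model has already been pulled, e.g.:
#   ollama pull llama2:13b
import json
import urllib.request

MODEL = "llama2:13b"   # example; use llama3.1:8b on an 8 GB Pi 5
PROMPT = "Explain what a Raspberry Pi is in one paragraph."

req = urllib.request.Request(
    "http://localhost:11434/api/generate",
    data=json.dumps({"model": MODEL, "prompt": PROMPT, "stream": False}).encode(),
    headers={"Content-Type": "application/json"},
)
with urllib.request.urlopen(req) as resp:
    result = json.load(resp)

# Ollama reports eval_count (generated tokens) and eval_duration (nanoseconds).
tokens = result["eval_count"]
seconds = result["eval_duration"] / 1e9
print(f"{tokens} tokens in {seconds:.1f} s -> {tokens / seconds:.2f} tokens/s")
```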

I have a full video covering the Pi 5 16GB, along with illustrations of some of the above points. You can watch it below:

Comments

By looking at benchmarks that do not access RAM at all (for example 'openssl speed -elapsed -evp aes-256-cbc'), we can see that the SoC revision does (close to) nothing: on your 16GB D0 board you now measured 1368167 (one run 1368096, the other 1368238), and I measured 1368001 on a 4GB board back in March (one run 1367818, the other 1368184). That's a 0.01% 'improvement', or in other words: result variation :)

Talking about the comparison with the 8GB variant above: were these numbers also taken with the latest firmware from Jan 8th?

The 8 GB was on a Jan 6th revision, but according to the Pi engineer who told me about the update, it only changed timings on the D0 models (I think... could've misinterpreted it).

The Linpack performance is higher because you have more RAM and can use a larger matrix size N. To be fair, use the same matrix size N on both; IMHO you will not see a significant difference in that case.

The graph puzzles me: what does x represent in the fit?

If it were the amount of memory, you would have gotten a perfectly linear fit of Price = 5 * GB + 40.

Jeff,
Price is precisely linear, not polynomial: $5 per GiB (price = $40 + $5 * GiB).

The graph isn't spaced correctly on the x-axis, which causes confusion.

If you require 16 GB of RAM, chances are you'll also require NVMe storage. At that point I suspect you're better off with an Intel N100 unless you need the RPi's I/O. With much higher performance, the N100 honestly looks like the better deal here.

Regarding the RAM pricing graph: not sure if I'm missing anything, but isn't the pricing linear? You pay $10 per 2 GB increment (the spacing on the x-axis is distorted).

Is the D0 coming to the smaller variants? And if so, how would one know if one was getting the D0 or not?

My take on it, as a maker and maintainer of FOSS, is that while *maybe* my end users don't directly need such a machine to run the programs, I (or my CI farms) certainly benefit from having a beefier, faster machine to build them. Even with same-ish CPU performance, more caching and a bigger RAM disk (tmpfs) for scratch workspaces allows for faster build/test turnaround, perhaps spinning up several scenarios in parallel, ultimately benefitting the users as they get tested features quicker.

As VMs/containers were mentioned, yes - such a Pi can also help me run more platforms for manual bug investigations on this popular platform (frankly, builds could be done on x86 cross hosts; however, confirming issues and solutions involving the Pi's USB, endianness, or whatever cannot be done virtually).

Maybe official support for Windows on Arm is on the way for the Pi 5, where 16GB makes sense. Or a Pi 500 "Pro" with 16GB, an M.2 SSD, and PoE for the enterprise market (with Windows 11 Pro). Can I dream?

I would note one major use for these boards is armv7l building. (You just have to boot the kernel8 kernel with its smaller page size, since the default 2712 kernels don't support armv7l.)

We use multiple 8 GB RPi 5 boards as armv7l GitHub Actions runners to build software for the armv7l variant of Chromebrew.

When needed, it may make sense for us to use the 16 GB boards.

Your arguments might be right, but that chart is totally misleading. The price on the y-axis is on a linear scale, but the memory on the x-axis is on a logarithmic scale (2, 4, 8, 16 evenly spaced).

Surely this is being built for some high-volume enterprise customers who asked for it, right? And then consumers get access to it because, why not?

Jeff, I really enjoyed this last video and your reasoned, data-driven takes. You gave real-world examples of why you might benefit from the 16GB version, and why a lower-RAM version might be just as great for different use cases. Currently, I'm in the "I want a RPi 5" camp, and at first I thought only the highest-GB version would be best, but honestly for what I would like to use it for (running containerized apps on my homelab with at most 4 users), I can probably pass on the 16GB version and save some cash for the power adapter or a cool HAT for an SSD.

I'm the guy who uses all the computers that others throw away to run in my homelab. All the RPis I have came into our home as gifts to myself or my son. Currently, I have two RPi 1s with 512MB of RAM running every day. One RPi 1 is a bastion host running Tailscale as an exit node at the work office, giving me access to my Linux computers over there from home, and connecting back home to my pfSense box (a 14-year-old i3 Dell Optiplex) running Tailscale as an exit node. The second RPi 1 runs Tailscale, allowing me to connect to it across the state as a backup node running Borg backup, giving me my third backup copy, which is also offsite. A 12-year-old Asus tower and another Dell Optiplex serve as the main app server and backup storage server in the homelab. My son is using a RPi 4 as his Home Assistant hub. That leaves an RPi 3 and RPi 3+ currently without jobs.

My Linux workstations are all computers that were being thrown out: another Asus tower i3 with 8GB of RAM, an HP 17" i7 laptop with an Optimus Nvidia GPU, 16GB of RAM, and a broken screen, and a Chromebook with hacked firmware and 2GB of RAM as my mobile "fun and vacation" computer.

Your positive videos have me "wanting" the latest in ARM tech, but they also inspire me to continue getting the most out of computers that at best would end up at an electronics recycler or at worst in a landfill. Thanks Jeff for your inspiration and well-thought-out videos, and for continuing to advocate for Free and Open Source Software.

Perfect curve fit:
Pi 5 price = 40 + 5M, where M is memory in GB.

I wonder how well it's going to fare compared to the 8GB model in the future, as stuff starts requiring more RAM over time...

I wouldn't. But that's because I know that unless you're running Windows, you probably don't need that much (for lightweight use). My guess is that Raspberry Pi is selling these because a lot of people don't know that, in most cases, 8GB of RAM should be plenty for these computers.

- A guy who owns a Chromebook with 4GB of RAM.

Let's see a 'why buy that Pi 5 at all' investigation, please. Times have changed. There are better alternatives now for desktop users.

To me, spending $160 for a 16GB Pi 5 card alone makes zero sense compared to any N100 x86_64 box that comes already integrated with a case, power supply, etc. for less money.

https://www.amazon.com/Beelink-Intel-N100-Computer-Desktop-Display/dp/B… as one example.

I still think the 4GB Pi 4 is the Pi sweet spot for non-desktop uses. Stable, low power, and runs cool enough for passive cooling. Plenty of horsepower for small home server needs.