Hi,

I have a friend who is looking to run a few simulations he has implemented in Python and needs around 256 GB of RAM. He estimates it will take a couple of hours, but he is studying economics, so take that with a grain of salt 🤣

For this instance, I recommended GCP, but I felt a bit dirty doing that. So, I was wondering if any of you have a buttload of memory he can borrow? Generally, would you lend your RAM for a short amount of time to a stranger over the internet? (Assuming internet access is limited to a single SSH port and other necessary safeguards are in place.)

  • markstos@lemmy.world · 12 hours ago

    Nope. Some algorithms are fastest when the whole data set is held in memory. You could design it to page data in from disk as needed, but it would be slower (rough sketch below).

    OpenTripPlanner, for example, holds the entire US road network in memory for fast driving directions, and it uses an amount of RAM in that ballpark.
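    A rough sketch of that page-from-disk idea, assuming NumPy; the file name, dtype, and array size here are made-up placeholders, not anything from OP's actual simulation:

    ```python
    import numpy as np

    # Disk-backed array instead of ~256 GB resident in RAM.
    # "state.dat" and the size are hypothetical stand-ins.
    n = 4_000_000_000  # 4e9 float64s ≈ 32 GB on disk
    data = np.memmap("state.dat", dtype=np.float64, mode="w+", shape=(n,))

    # Touch the data in chunks so only the active pages live in memory.
    chunk = 50_000_000
    total = 0.0
    for start in range(0, n, chunk):
        total += float(data[start:start + chunk].sum())
    print(total)
    ```

    Whether that is tolerable depends on the access pattern: sequential chunked passes like this are fine, but random access across the whole array thrashes the page cache, which is presumably why OpenTripPlanner keeps it all resident.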

    • cevn@lemmy.world · 12 hours ago

      Sure, that is why I said usually. The fact that two people replied with the same OpenStreetMap data set kinda proves my point.

      Also, do you need the entire US road system in memory if you are going somewhere 10 minutes away? Seems inefficient, but I am not an expert here. I guess it is one giant graph; if you slice it up, suddenly there are a bunch of loose ends that break the navigation.
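      A toy illustration of those loose ends, using a made-up 4x4 grid as the "road network":

      ```python
      # Hypothetical 4x4 grid: nodes are intersections, edges are road segments.
      nodes = [(x, y) for x in range(4) for y in range(4)]
      edges = [((x, y), (x + 1, y)) for x in range(3) for y in range(4)]   # east-west
      edges += [((x, y), (x, y + 1)) for x in range(4) for y in range(3)]  # north-south

      # Slice the map down the middle into a west half and an east half.
      west = {n for n in nodes if n[0] < 2}

      # Any segment with one end in each half is severed by the cut,
      # so neither slice on its own can route across it.
      cut = [e for e in edges if (e[0] in west) != (e[1] in west)]
      print(f"{len(cut)} of {len(edges)} segments cross the cut")  # 4 of 24
      ```

      Routing across the boundary would need both slices loaded plus the severed segments stitched back together, which is roughly why one giant graph ends up simpler.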