6️⃣9️⃣4️⃣2️⃣0️⃣

  • 0 Posts
  • 39 Comments
Joined 2 years ago
Cake day: October 28th, 2023



  • This is a good read and makes a lot of great points. I think everyone in tech needs to understand the arguments here. The biggest thing for me is that LLMs are incredibly useful tools, but not in the way they are advertised. They are great for learning how existing code works, but shit at writing anything novel or innovative. From the article:

    The past is a prison when you’re inventing the future.

    In my opinion, if you’re using LLMs to do anything but help you learn from the past, you’re doing it wrong. LLMs cannot move you forward, and I think that may be the point.


  • Assuming your local service is accessible from the nginx server, you can proxy the request to it:

    server {
      listen 80;                               # public port nginx listens on
      location / {
        proxy_pass http://10.100.100.2:3000/;  # local service, reached over the VPN
      }
    }
    

    …where 10.100.100.2 is your local IP on the VPN, 3000 is the local port your service is listening on, and 80 is the public port your nginx server listens on. Anything that hits your nginx server at http://yourserver.com/ gets proxied back to your local service at http://10.100.100.2:3000/. Depending on what you’re hosting, you may need additional directives in the config (for example, forwarded headers or WebSocket support).
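
    A common extension is passing the original host and client address through to the backend, plus WebSocket upgrade support if the service needs it. A sketch of what that might look like, assuming the same hypothetical addresses as above:

    server {
      listen 80;
      server_name yourserver.com;  # hypothetical domain, replace with your own

      location / {
        proxy_pass http://10.100.100.2:3000/;

        # Forward the original host and client IP to the backend
        proxy_set_header Host $host;
        proxy_set_header X-Real-IP $remote_addr;
        proxy_set_header X-Forwarded-For $proxy_add_x_forwarded_for;

        # Allow WebSocket upgrades, if your service uses them
        proxy_http_version 1.1;
        proxy_set_header Upgrade $http_upgrade;
        proxy_set_header Connection "upgrade";
      }
    }
    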