Running LLMs Locally

To get a feel for the current AI hype around tools like ChatGPT, a good starting point is to run an LLM locally. MIT published this article on running LLMs on a laptop.
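For a concrete starting point, here is a minimal sketch using the llama-cpp-python bindings to load a quantized model and generate a short completion on a laptop. The model filename is a placeholder assumption; any GGUF model small enough for your machine's RAM should work.

```python
# pip install llama-cpp-python
from llama_cpp import Llama

# Path to a locally downloaded GGUF model (placeholder filename).
llm = Llama(model_path="./models/llama-2-7b-chat.Q4_K_M.gguf", n_ctx=2048)

# Run a short completion entirely on the local machine.
output = llm("Q: Why run a language model locally? A:", max_tokens=64)
print(output["choices"][0]["text"])
```

Everything here runs offline once the model file is on disk, which is the whole appeal of the local setup.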

Someone compared the storage needed to run these models with the storage needed for an offline copy of Wikipedia, and the two are quite comparable.
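A rough back-of-envelope calculation makes the comparison concrete (the numbers below are illustrative assumptions, not measurements): a 7-billion-parameter model quantized to 4 bits per weight comes out to a few gigabytes on disk, which is the same order of magnitude as a compressed text-only Wikipedia dump.

```python
# Back-of-envelope storage estimate for a quantized local model.
# All numbers are illustrative assumptions, not measurements.
params = 7e9            # 7-billion-parameter model
bits_per_weight = 4     # common 4-bit quantization
model_gb = params * bits_per_weight / 8 / 1e9
print(f"Quantized 7B model: ~{model_gb:.1f} GB")  # roughly 3.5 GB
```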
