---
date: 2023-04-16 14:08:55+00:00
description: My local intelligent auto-complete project
draft: true
mp-syndicate-to:
- https://brid.gy/publish/mastodon
- https://brid.gy/publish/twitter
post_meta:
- date
preview: /social/05c819fa16605859eae66fc66c296fef7920121616c4741311f46be357335983.png
resources:
- name: feature
src: images/thesis_mug_small.jpg
tags:
- ai
- open-source
title: Introducing Turbopilot
type: posts
url: /2023/04/16/turbopilot
---
I started TurboPilot over the Easter weekend when I was stuck at home, bored, with COVID. As an AI specialist, I've been following OpenAI, Copilot and all things GPT very closely, and I've been inspired and excited by all the open activity like [llama.cpp](https://github.com/ggerganov/llama.cpp), which lets you run large language models locally on CPU. I decided that it might be quite useful to have intelligent code autocompletion that runs locally, without sending my data to the OpenAI mothership for analysis, and that still works when I'm travelling with limited connectivity, like on trains or planes.