edits

This commit is contained in:
James Ravenscroft 2023-01-15 21:35:51 +00:00
parent c368c78dcd
commit 945749f03a
1 changed file with 1 addition and 1 deletion


@@ -33,7 +33,7 @@ I'm pretty rubbish at #weeknotes but I want to try and get better at them.
- I've subscribed to [noted.lol](https://noted.lol/), a blog and newsletter all about self-hosted software. It's a great little newsletter and I've already found a couple of interesting packages through it (see below)
- I've been reading the BCS ITNow magazine from Winter 2022 that's been sat on my side-table for a couple of weeks. There are quite a few articles that seem to shower the silicon valley ultra-capitalistic view of Web3 in glory but then there's a poll about how most BCS members asked about crypto and NFTs thought it was garbage. A mixed bag!
- - ChatGPT is everywhere at the moment, it's actually quite irritating. There are lots of predictions that it could lead to a dark internet full of generated content but I fear that it's already led to a dark internet full of terrible takes on ChatGPT. So many articles are variations on "here is a screenshot of a prompt I fed into ChatGPT and here is the output. Isn't it clever?" Here are a couple of interesting articles:
+ - ChatGPT is everywhere at the moment. There are lots of predictions that it could lead to a dark internet full of generated content but I fear that it's already led to a dark internet full of terrible takes on ChatGPT. So many articles are variations on "here is a screenshot of a prompt I fed into ChatGPT and here is the output. Isn't it clever?" Here are a couple of actually interesting articles about it:
- [Unskilled Cybercriminals May Be Leveraging ChatGPT to Create Malware](https://www.infoq.com/news/2023/01/chatgpt-creating-malware/) - people with limited or no software development background have been using ChatGPT to develop malware tools. Whilst I don't buy that developers will be out of a job any time soon, the system can produce some useful code snippets that someone could feasibly string together into a program.
- [China, a Pioneer in Regulating Algorithms, Turns Its Focus to Deepfakes - WSJ](https://www.wsj.com/amp/articles/china-a-pioneer-in-regulating-algorithms-turns-its-focus-to-deepfakes-11673149283) - apparently China are already looking into regulating generative models. I think the cat is out of the bag on this one, you can't contain digital assets that have leaked onto the internet. However governments could, presumably, limit access to large volumes of GPUs needed to train LLMs (and even infer on bigger models).