Add article about the vision behind Cozy Coder project #4
base: main
Conversation
This is required to get SASS for styles working. Signed-off-by: Jan Ehrhardt <59441+jehrhardt@users.noreply.github.com>
Signed-off-by: Jan Ehrhardt <59441+jehrhardt@users.noreply.github.com>
✅ Deploy Preview for cozycoder ready!
Signed-off-by: Jan Ehrhardt <59441+jehrhardt@users.noreply.github.com>
Force-pushed from e6c6348 to 5601f80
Let's be honest, AI is totally overhyped right now! Especially when it comes to software development, the main driver is efficiency and the hope for making expensive software developers irrelevant in the near future. In my opinion, that's probably not going to happen. Just look at how bad LLMs still are.
But this doesn't mean there is no value in AI for software development. The current LLMs can be really useful for developers.
These two paragraphs are in conflict with each other:
You state, "Just look at how bad LLMs still are" and then, "The current LLMs can be really useful for developers."
One side states that they are not good while the other states they are rather useful.
Reading the rest of the article, I have a feeling the point you are trying to convey here is the potential of LLMs to be very helpful for developers. If so, perhaps try using "potential" in the sentence to drive this point home?
The obvious problem with chat is that everyone is just treating it as chat between a human and a bot. But that's not true. Each time you interact with Copilot or other LLMs the full conversation is send to the LLM as a context. There is no reason, why this context couldn't be treated like a Git history instead.
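The "context as Git history" idea in this excerpt could be sketched as messages that point at a parent rather than a flat list — forking a conversation is then just starting a new branch from an earlier message. This is a hypothetical illustration only, not Cozy Coder's actual design; all names here are invented:

```python
from dataclasses import dataclass
from typing import List, Optional

@dataclass
class Message:
    role: str                            # "user" or "assistant"
    content: str
    parent: Optional["Message"] = None   # None for the conversation root

def append(parent: Optional[Message], role: str, content: str) -> Message:
    # Each message points at its parent, so a "conversation" is a path
    # through a tree -- much like commits in a Git history.
    return Message(role, content, parent)

def context(tip: Message) -> List[Message]:
    # Walk back to the root to rebuild the full context sent to the LLM.
    msgs = []
    node: Optional[Message] = tip
    while node is not None:
        msgs.append(node)
        node = node.parent
    return list(reversed(msgs))

# Original conversation.
m1 = append(None, "user", "Generate a new Elixir project")
m2 = append(m1, "assistant", "Here is the mix.exs ...")
m3 = append(m2, "user", "Add Phoenix")

# A colleague "forks" at m2 and continues differently,
# without talking to the LLM from scratch.
fork = append(m2, "user", "Add a CLI instead")
```

Both branches share the same prefix up to the fork point, which is exactly what lets a colleague reuse your conversation instead of rebuilding the context.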
Let's imagine a tool that gives developers more control and go through the above example:
You start a conversation with a LLM to generate you new project. Of course, it will also run the generated code on your local machine automatically to prove it is working. Once your initial project setup is done, a pull request is provided that can be reviewed by your colleagues. Of course, your colleagues will have access to chat, which is linked in the pull request. Next time one of your colleagues wants to setup another project, they just fork your conversation at some point and continue a new conversation from that point on instead of talking to the LLM from scratch. Since new projects are created frequently in your company, someone is taking your conversation and introduces some parameters to it to use it as a template. Now you have a reusable conversation template that you can simply trigger from your terminal, provide few details (e.g. a project name) and get the code generated.
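The template step described in this excerpt — taking a recorded conversation and introducing parameters — could be as simple as substituting placeholders before replaying it. A hypothetical sketch; neither the format nor the names come from Cozy Coder:

```python
from string import Template

# A recorded conversation with the project-specific parts parameterized.
# Purely illustrative -- not an actual Cozy Coder artifact.
conversation_template = [
    ("user", Template("Generate a new $language project named $name")),
    ("assistant", Template("Scaffolding $name ...")),
]

def instantiate(template, **params):
    # Fill in the parameters to get a concrete conversation to replay.
    return [(role, text.substitute(**params)) for role, text in template]

# "Trigger from your terminal, provide few details (e.g. a project name)":
replay = instantiate(conversation_template, language="Elixir", name="cozy_demo")
```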
You start a conversation with a LLM to generate you new project
I think you mean one of:
You start a conversation with a LLM to generate your new project
You start a conversation with a LLM to generate a new project
Suggested change:
- You start a conversation with a LLM to generate you new project. Of course, it will also run the generated code on your local machine automatically to prove it is working. Once your initial project setup is done, a pull request is provided that can be reviewed by your colleagues. Of course, your colleagues will have access to chat, which is linked in the pull request. Next time one of your colleagues wants to setup another project, they just fork your conversation at some point and continue a new conversation from that point on instead of talking to the LLM from scratch. Since new projects are created frequently in your company, someone is taking your conversation and introduces some parameters to it to use it as a template. Now you have a reusable conversation template that you can simply trigger from your terminal, provide few details (e.g. a project name) and get the code generated.
+ You start a conversation with a LLM to generate you new project. Of course, it will also run the generated code on your local machine automatically to prove it is working. Once your initial project setup is done, a pull request is provided that can be reviewed by your colleagues. Of course, your colleagues will have access to chat, which is linked in the pull request. Next time one of your colleagues wants to setup another project, they just fork your conversation at some point in the context and continue a new conversation from that point on instead of talking to the LLM from scratch. Since new projects are created frequently in your company, someone is taking your conversation and introduces some parameters to it to use it as a template. Now, you have a reusable conversation as a template that you can simply trigger from your terminal, provide few details (e.g. a project name), and have the code generated.
Added a few things to enhance what points you are talking about. Additionally, some grammar (commas) to help with fluidity.
"Of course, your colleagues will have access to chat, which is linked in the pull request".
The "Of course," construct feels a bit out of place. I would maybe suggest
"Additionally, your colleagues will have access to chat, which is linked in the pull request"
That or "plus" if you want to make it more casual.
Let's imagine a tool that gives developers more control and go through the above example:
You start a conversation with a LLM to generate you new project. Of course, it will also run the generated code on your local machine automatically to prove it is working. Once your initial project setup is done, a pull request is provided that can be reviewed by your colleagues. Of course, your colleagues will have access to chat, which is linked in the pull request. Next time one of your colleagues wants to setup another project, they just fork your conversation at some point and continue a new conversation from that point on instead of talking to the LLM from scratch. Since new projects are created frequently in your company, someone is taking your conversation and introduces some parameters to it to use it as a template. Now you have a reusable conversation template that you can simply trigger from your terminal, provide few details (e.g. a project name) and get the code generated.
Of course, not every company creates so many new projects. But this is just one example to show what could be possible, if we were using LLMs more efficiently.
In general, these last two paragraphs are very important to the article! They really drive home the potential of Cozy Coder, so I think it makes sense to make them easy to understand for the reader.
One thing that makes this paragraph not-so-easy to read is the overuse of "Of course"
as a construct. I would suggest only using the same construct once in a paragraph (unless you are trying to convey a feeling of 'sluggish repetition' - which I don't believe you are, here).
I may suggest here:
Certainly,
for a more formal but soft tone.
Obviously,
for a more informal but straight-to-the-point tone.
The above example gives a rough idea of what you as a software developer should be able to do with Cozy Coder. But to make the tool really good, it obviously must run on your laptop.
Many developers use CLIs and they can easily be used within scripts to automate tasks. As Cozy Coder is designed for developers, it must provide a proper CLI. But also the access to the file system is important to directly modify local files, if you want to.
Suggested change:
- Many developers use CLIs and they can easily be used within scripts to automate tasks. As Cozy Coder is designed for developers, it must provide a proper CLI. But also the access to the file system is important to directly modify local files, if you want to.
+ Many developers use CLIs and they can easily be used within scripts to automate tasks. As Cozy Coder is designed for developers, it must provide a proper CLI. Also, access to the file system is important to directly modify local files, if you want to.
There is no contradiction necessary :) only additions. This makes it feel easier to read.
One last remark about "if you want to". If the point you are trying to convey is really "only if the user really wants to allow the LLM access to the file system", then keep it.
If the point you want to convey is "when I need the LLM to change things", it is better as ", when you want it to."
All in all! Super cool article!!
Cozy Coder will of course not be limited to local LLMs, but they will always be the preferred choice and well supported. Ideally we will all together achieve the usage of local LLMs, where possible and only use the cloud based ones, where necessary.
Of course, Cozy Coder will also have a server side part. But it will be mainly for enabling sharing and collaboration as well backups of data.
Double use of "of course". If I may suggest swapping one of them out for a synonym :)
Co-authored-by: Alexander <alexander.piercey@gmail.com> Signed-off-by: Jan Ehrhardt <59441+jehrhardt@users.noreply.github.com>
Co-authored-by: Alexander <alexander.piercey@gmail.com> Signed-off-by: Jan Ehrhardt <59441+jehrhardt@users.noreply.github.com>
No description provided.