I would say my efficiency is up ~20% since starting to use it, and my Google searches & StackOverflow visits are probably down 80-90%. At least with respect to this corner of the internet, they are both in mortal danger.
- I ask it to create a list of questions that different personas might want to answer with our reports section. I then ask it to categorize those questions & then provide suggestions for graphs & dynamic filters that would help answer them.
- I ask it for help translating emails and documents for international customers.
- I ask it to create markdown-formatted spec documents with really verbose context. Helps me as a sort of foundation for feature sprint kickoffs.
- I ask it to take internal documentation and to simplify it so that we're able to use it for public facing help center documentation.
- I use it as my first resource for asking questions about SQL queries, React patterns, explaining different things eg SVG properties and how to manipulate them. I gut check things with Google when I feel like it might be hallucinating but generally it does really well ~ 90% of the time. Saves lots of time compared to going to Google first.
- I ask it for help writing tests or help refactoring code
- I asked it for help in creating some policy & procedures docs that we needed for compliance. Essentially gives you a decent template to then build from & customize.
- Lots of other things. It replaced Google for so many things in my daily workflow. It also helps a lot when you're not feeling creative and you need some ideas.
I asked ChatGPT to give me boilerplate for a Fabric.js object today, and it added a random object property that was nowhere to be found in the official docs. At first I thought it was amazing to get real-looking code, but upon testing I wasn't sure what to make of it.
I literally cannot understand how people who code need GPT as an assistant for writing code. If I can reason about it, I can write it faster than the prompting feedback loop takes.
It can increase efficiency for generalists. For deep work, it's less useful.
You haven't spent enough time with GPT-4 and Copilot to understand how LLMs can save you time. There is a reason the world's top engineers, like Andrej Karpathy[1] and Guido van Rossum[2], are using these tools: they save a ton of time and work when used correctly.
When I ask ChatGPT to do it for me it gives me an excellent starting point. Sure, there will be bugs, but because I know what I want I can spot and fix them immediately. It's much faster to adjust ChatGPT's code than it is to Google around for starting points.
Another example is shell scripts. I only touch bash once every few months, and I keep forgetting the syntax for certain operations. Asking ChatGPT to give me a starting point is much faster than googling and visiting 20 StackOverflow posts for what I want.
But I agree with you that for day-to-day work on the same codebase where you have all the context, ChatGPT usually isn't worth it.
Prompts like these
1. Generate me a build pipeline using GitHub actions that builds a java project with docker and posts JUnit test results to a PR
2. Write me a python program that copies all the Cloudwatch dashboards from my production AWS account, replaces every instance of the word ‘prod’ with ‘qa’, and posts the results to this AWS account.
These are two things I’ve done recently that aren’t particularly enjoyable but necessary parts of any software work.
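For the second prompt, a minimal sketch of what such a script might look like (the `prod_to_qa` helper and the function names are my own, not from the original script; the boto3 CloudWatch clients for the two accounts are assumed to be set up by the caller):

```python
def prod_to_qa(text: str) -> str:
    """Swap every occurrence of 'prod' for 'qa'."""
    return text.replace("prod", "qa")

def copy_dashboards(src_cw, dst_cw):
    """Copy all CloudWatch dashboards from one account to another,
    renaming prod -> qa along the way.

    src_cw / dst_cw are boto3 CloudWatch clients, e.g.
    boto3.Session(profile_name="prod").client("cloudwatch").
    """
    for page in src_cw.get_paginator("list_dashboards").paginate():
        for entry in page["DashboardEntries"]:
            name = entry["DashboardName"]
            # DashboardBody is a JSON string, so a plain text replace works
            body = src_cw.get_dashboard(DashboardName=name)["DashboardBody"]
            dst_cw.put_dashboard(
                DashboardName=prod_to_qa(name),
                DashboardBody=prod_to_qa(body),
            )
```

The text-replacement step is deliberately separated out so it can be sanity-checked without touching AWS.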
Other than that I used it this morning while editing a report we hired a team to create. It was helpful to reword some sentences that weren't very clear before sending it off to the client.
Other than that, I mostly use it to brainstorm ideas/give me related concepts to something I'm working on.
Also, not work related, but last week I used chatgpt to create an opening message for a dating app. I knew the gist of a joke I wanted to say related to this woman's interests, but had chatgpt word it for me. There was a lot more to our conversations, but she did at least respond to the opener and we got the conversation rolling. We actually went on a date this weekend, where I had to rely on my own brain's inefficient language model! It went pretty well though.
I did actually do something somewhat similar with another girl. She was a nursing student who used chatgpt for statistics homework (as well as to cheat on exams!). After we hung out I sent her a simple chatgpt generated message along the lines of thanks for hanging out, had a great time, should do it again, etc. Then immediately after I sent a message saying that I had asked chatgpt what to say to someone after a date. She seemed to think that was funny...although we never did hang out again (for other reasons).
You can also check out the awesome work Tom did: https://github.com/twpayne/go-jsonstruct
We have had strict instructions not to put any code/email/text into ChatGPT, just use it as a virtual person to talk to and get ideas from.
But: the moment ChatGPT v4 can run on-prem in my private cloud, things are going to get wild. One advantage of working in a large multinational is that there is a procedure or a standard for everything. I have 25 years of design documents, source code, test documents, user documents, and a ticket system with 25 years of problems & answers showing how each ticket was resolved. The moment I can feed all of that into my local ChatGPT instance, the whole helpdesk/support system will change dramatically. I'm optimistic on the timeline: I think that within the next 2 to 3 years, all commercial ticket tracking systems will have their own ChatGPT-like back-end.
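As a toy illustration of the retrieval half of such a ticket back-end, here is a naive bag-of-words similarity search over past tickets (stdlib only; a real system would use embeddings, and the ticket data and function names here are invented):

```python
import math
from collections import Counter

def tokenize(text: str) -> Counter:
    """Crude tokenizer: lowercase, split on whitespace, count terms."""
    return Counter(text.lower().split())

def cosine(a: Counter, b: Counter) -> float:
    """Cosine similarity between two term-count vectors."""
    dot = sum(a[t] * b[t] for t in a)
    na = math.sqrt(sum(v * v for v in a.values()))
    nb = math.sqrt(sum(v * v for v in b.values()))
    return dot / (na * nb) if na and nb else 0.0

def most_similar_ticket(query: str, tickets: dict[str, str]) -> str:
    """Return the id of the historical ticket most similar to the query."""
    q = tokenize(query)
    return max(tickets, key=lambda tid: cosine(q, tokenize(tickets[tid])))
```

The retrieved ticket (and its recorded resolution) would then be handed to the model as context for drafting an answer.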
However, in a year I'm sure it'll be spontaneously demanding equal rights :)
Also great for asking how-to-do-X-in-Python questions and explaining ML concepts. The danger is I won't learn Python properly but will just remember the prompts!
The HR team uses it all the time to generate initial drafts for job descriptions and the like.
A member of the product team pointed our API documentation at it and asked it to write a simple query. 99% of the code was correct, except it used the wrong header for authentication. Lo and behold, that was something we hadn't documented very well!
We did an employee survey a few months ago on who was using AI tools and how, and I'm confident that usage has gone up significantly since then. I know mine has.
Here is the final 3D printed wheel that balances beautifully. https://twitter.com/CommonStef/status/1654663207707979776?s=...
This is a project that I'd been totally stuck on for quite a while, and it was a breeze to finish with the help of ChatGPT. Using ChatGPT to make quick little design tools in P5.js is a total game changer for me.
I wonder how many such problems exist, where the inventor had essentially given up because there was no easy way to bring in expertise to answer a pressing question.
With ChatGPT's assistance, I was able to create a utility that scanned the 1 million+ files on the faulty destination drive and then re-copied those files from the original source drive to a new destination drive, all while maintaining the updated directory and file structure.
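The original utility isn't shown, but the core of such a re-copy tool might look like this sketch (the function name and exact matching logic are my guesses at the approach described):

```python
import shutil
from pathlib import Path

def recopy_tree(source: Path, faulty_dest: Path, new_dest: Path) -> int:
    """Walk the faulty destination drive and re-copy each of its files
    from the original source drive into a fresh destination, preserving
    the directory and file structure. Returns the number of files copied."""
    copied = 0
    for path in faulty_dest.rglob("*"):
        if not path.is_file():
            continue
        rel = path.relative_to(faulty_dest)
        src_file = source / rel
        if not src_file.is_file():
            continue  # exists only on the faulty drive; nothing to re-copy
        target = new_dest / rel
        target.parent.mkdir(parents=True, exist_ok=True)
        shutil.copy2(src_file, target)  # copy2 preserves timestamps too
        copied += 1
    return copied
```

For a million-plus files the `rglob` walk is the slow part, but it only has to run once.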
Are you using Windows or Microsoft Office? Same here. Or Microsoft Exchange?
I'll ask what kind of electronic equipment to buy. So many things.
I have found a lot of bugs with it too, though, so I'm getting educated. For example, at least for me, when I ask for a citation of something, it just gives me URLs that go to 404s. Not once have I gotten an actual real citation. So I'm careful about that.
There have also been many funny things, to me, that I have asked and got weird responses that amused me.
Chatbase - We recently started using GPT based support and seeing good results, users are interacting with the bot more.
AnySummary - A nice tool to upload a file in any format and chat with it. The authors are using it to write articles from podcasts and videos, saving them a huge amount of time.
OpenTools AI - A website for finding new AI tools. My team is always checking this site for new tools that can reduce cost and time.
MidJourney - All of our images in the articles are from MJ now.
Hope this helps others.
I hadn't coded in ages. I started a hobby project to see whether I could remember enough to "set up a cute chatbot over email" with ChatGPT's API.
At some point, ChatGPT asked me what added value users would get from the project. That was an interesting question!
It was also an excellent example of how this project became more of a collaboration instead of a one-way street of asking ChatGPT to do stuff. The back-and-forth brings in added value beyond each of our contributions.
So we (the AI and I) decided to document this collaborative journey on the project. So far, we've worked on:
1. Planning the project and pieces of it: excellent for frameworks, ideas, and feedback.
2. Coding + tech setup: we wrote about six superpowers, from choosing between options (like AWS vs. GCP vs. Zapier) to learning code best practices and more.
3. Building a website with ChatGPT and other AI tools in a day.
4. Making a video for social media.
And there are other parts we'll continue to write about. For anyone interested, you can read more at: https://dearai.substack.com/p/introduction
This was typically a blocker for me; I do not surf with direct connections...
It’s not that it’s particularly amazing, more that the google hits I get when searching for my problems are littered with SO mirrors and low effort blogspam. If this was ~7 years ago I think the search results could have gotten me just as far.
At work I’ve been using it a bit to improve feature/design concepts. It’s helped me come up with some unique improvements (and some really boring and generic ones too.)
Also asked it to write a few commands for BigQuery admin, so far so good but I always double checked due to the nature of those commands.
For my studies I asked it to write a C program about processes. It got it almost right, except for one spot that cost me a while to figure out. Still faster than me banging on random StackOverflow doors. It also helps a LOT by telling me which kernel source file a given piece of functionality lives in.
My real concern is that I'd rely on it too much whenever I couldn't figure out something. That's fine for work because work is boring anyway and I want to end it asap, but not good for study because I want to grow grit.
- Debugging - I just throw in hundreds of lines of logs and it's pretty good at figuring out what needs to be done
- Landing page copy generation and iterations on that copy
- Summarizing papers
- Extracting insights from user feedback
- Improving emails, fixing grammar
How does that work, given the small context window size? I've tried dumping a lot of text into the ChatGPT window and it just ignores most of it, or complains about the input being too long.
If I ask it detailed questions about the topic I did my PhD on, it can’t answer correctly. I worked on fluid dynamics for a couple of years in my first role, and there is a well known algorithm called SIMPLE for computing time evolution of steady state problems, and it couldn’t generate code for this even with lots of playing with prompts.
I do web stuff more these days and I couldn’t get it to output a fully working React hook in Typescript that POSTs with Axios; there was always some sort of type error.
I'm working on a site right now where I use the API to generate full summaries of AI sites: features, excerpts, descriptions. It works really well. I find it works best when you split the prompts up.
I had it draft a thorny recursive SQL query for me one time (which I did write some extensive automated tests to validate)
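The query itself isn't shown, but as a generic illustration of the kind of recursive SQL involved, here's a toy recursive CTE (run through SQLite for convenience) that walks a reporting chain — the table and data are invented:

```python
import sqlite3

con = sqlite3.connect(":memory:")
con.executescript("""
CREATE TABLE employees (id INTEGER PRIMARY KEY, name TEXT, manager_id INTEGER);
INSERT INTO employees VALUES (1, 'Ada', NULL), (2, 'Ben', 1), (3, 'Cal', 2);
""")

rows = con.execute("""
WITH RECURSIVE chain(id, name, depth) AS (
    -- anchor: start from the top of the org (no manager)
    SELECT id, name, 0 FROM employees WHERE manager_id IS NULL
    UNION ALL
    -- recursive step: pull in each row's direct reports
    SELECT e.id, e.name, c.depth + 1
    FROM employees e JOIN chain c ON e.manager_id = c.id
)
SELECT name, depth FROM chain ORDER BY depth
""").fetchall()
```

Writing automated tests against a known-good fixture like this, as the commenter did, is exactly the right way to trust model-drafted SQL.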
ChatGPT as a debugging partner has been an Adam-and-Eve sort of match for me. It's a saviour!
The biggest issue we've seen is that folks don't really know how to use it. We created a Slack channel to share examples, encourage people to try it, and help each other out, which seems to be increasing usage (and memes).
Full disclosure: I built a ChatGPT for Teams tool with a friend, so most of the usage is sharing links/collaborating with that tool.
I use it for coding the site but also for creating the user sessions, which right now include a group trivia game and a D&D session, both managed by GPT-4.
I've added in image generation using DALLE (somehow Midjourney still has no API) and it makes for unique sessions each time. With good prompting, it's a fun group experience.
And once I was done, asking it to write up some functions for the way they interact worked out very well.
In general, I've made a habit of asking it coding questions (usually how to do something in a complicated framework) before bothering teammates. Probably about 80% success rate.
Letting people get a free taste directly with ChatGPT was definitely a smart move from OpenAI. The execs are already on board because of it.
I asked ChatGPT to take my script and modify it to utilize multithreading. I had used threading in a project years ago but couldn't remember exactly how to implement it. It gave me some snippets that got me 95% of the way there; just a couple of small tweaks and it was good to go.
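For reference, modern Python makes this kind of change fairly painless with `concurrent.futures` — a minimal sketch, where `process` stands in for whatever per-item work the original script does:

```python
from concurrent.futures import ThreadPoolExecutor

def process(item):
    # stand-in for I/O-bound work (downloads, API calls, file reads, ...)
    return item * 2

# map the work across a small pool of threads, preserving input order
with ThreadPoolExecutor(max_workers=4) as pool:
    results = list(pool.map(process, range(10)))
```

Threads help most when the work is I/O-bound; for CPU-bound work, `ProcessPoolExecutor` has the same interface.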
I built a solution to take my ChatGPT Plus subscription everywhere with me on the computer. Juggling browser windows was frustrating.
I’d really appreciate feedback from you guys. Hope it’s helpful.
www.constitute.ai
So you're pushing 100 Chinese numbers through ChatGPT to get Arabic equivalents. What do you then do to ensure the quality of output is high? Do you eyeball the list and go "hm, seems plausible"? Spot checks? Is there some context around the lists that means erroneous translations will be quite obvious to the trained eye?
I'm always curious what QA looks like in other fields.
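One cheap QA option is an independent re-implementation to diff against the model's output — e.g. a small Chinese-numeral parser like this sketch (handles values below 10,000 only; entirely my own construction, not something from the thread):

```python
DIGITS = {"零": 0, "一": 1, "二": 2, "三": 3, "四": 4,
          "五": 5, "六": 6, "七": 7, "八": 8, "九": 9}
UNITS = {"十": 10, "百": 100, "千": 1000}

def chinese_to_int(s: str) -> int:
    """Convert a simple Chinese numeral (< 10,000) to an int."""
    total, num = 0, 0
    for ch in s:
        if ch in DIGITS:
            num = DIGITS[ch]
        elif ch in UNITS:
            # bare unit means 1, as in 十五 = 15
            total += (num or 1) * UNITS[ch]
            num = 0
    return total + num
```

Any disagreement between this and the model's list flags a row for human review, which scales better than eyeballing all 100.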
Our boss uses it to formulate emails better.
Mostly using it for content generation.