I’m used to seeing articles about AI being used either for highly scientific purposes or for generating semi-entertaining nonsense. For a personal business that involves managing appointments, documenting meetings, tracking payments, etc., can AI help with any of that? Other things include undertaking CPD training, occasional advertising, and maintaining a website from time to time.
The people I know who don’t think AI has any use for them belong in this category; they work in areas such as mental health, yoga teaching and training, nursing, and massage therapy.
Wrong question. “I have a solution (‘AI’), what’s the problem it should solve?” This is the path towards micromanaging stuff that’s not core to the enterprise.
Instead, try to identify specific problems in the specific context, or factors that are most relevant for success. Then see what the solution could be. That solution might be “AI”, or a bunch of sticky notes, or whatever else.
Other than that: Wherever you use a new tech like ‘AI’, also consider the risks. For example, do you really want to outsource part of your customer relations to an unpredictable thing that sends them the implicit message that you don’t care to directly communicate with them? Etc.
LLMs are used in particular to process large amounts of text. I remember my first encounter with this sort of thing in 2009: somebody gave a talk about observing topics on Twitter, e.g. to track the source of fake news or figure out why some particular topic went viral.
You might already be using one regularly through a translation tool. Just yesterday I saw a FOSS app called receipt-wrangler, which uses an LLM to parse shopping receipts, because a simple scan and OCR would still leave you with a highly unstructured heap of text that is hard to parse into anything useful.
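To make that concrete, here is a rough sketch of what “parse an OCR’d receipt with an LLM” might look like. This is not receipt-wrangler’s actual code: the `call_llm` helper, the prompt, and the field names are all hypothetical stand-ins for whatever API and schema you would actually use.

```python
import json


def call_llm(prompt: str) -> str:
    """Stand-in for whatever LLM API or local model you actually call (hypothetical)."""
    raise NotImplementedError


def parse_receipt(ocr_text: str) -> dict:
    # Ask the model to turn the messy OCR dump into a fixed JSON shape.
    prompt = (
        "Extract merchant, date (YYYY-MM-DD), total, and line items "
        "(name, price) from this receipt text. Respond with JSON only.\n\n"
        + ocr_text
    )
    raw = call_llm(prompt)
    data = json.loads(raw)  # will raise if the model strays from pure JSON
    # A couple of cheap sanity checks before trusting the output.
    if not isinstance(data.get("items", []), list):
        raise ValueError("unexpected structure from the model")
    return data


# Usage (illustrative): feed in the unstructured heap an OCR pass produces.
# receipt = parse_receipt("ACME GROCERS 02/03/2024 MILK 2.49 BREAD 3.10 TOTAL 5.59")
```

The point is that the LLM only does the unstructured-text-to-structure step; everything downstream works with ordinary, checkable data.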
AI is best used for creativity. It’s not precise, and it should not be relied upon for executing important tasks.
My advice for a small business owner would be to use AI to bounce ideas off of. Ask it for new ideas. Tell it things about your business and your situation, and ask it for potential threats to your model or potential improvements to your process.
But treat it as a consultant. Meaning you are still the decision-maker, and it’s your job to evaluate its ideas.
It’s creative and highly error-prone. So it’s good for brainstorming. Not good for precision planning or execution.
If you tell me what line of business you’re in, I can provide you with some ideas about how I would use AI to help with that business.
Some things I have used AI (ChatGPT 4) for:
- I ask it questions about how to use specific software tools and libraries
- I had it develop a plan for dealing with mold in my apartment (since the mold itself was making it hard to think)
- I describe a concept and ask if there’s a term for it. This helps me find online discussions by humans about that topic. I trust humans for accurate information and use AI to help me find the search terms
I wouldn’t wish mold exposure on anybody. I hope you make a full recovery.
Those are good suggestions, thank you. I’ll likely be retiring soon but the people I’m thinking of are some psychotherapists I know. It’s very interpersonal and thankfully a relatively no-nonsense profession. They don’t want technology between themselves and their clients, that’s for certain.
Black Hat and DEF CON just ended, and I’ll share my impression from the LLM-related talks given there. Microsoft VPs charged CISOs attending the summit additional money to hear about how AI will disrupt everything and be the future and blah blah magical thinking.
Meanwhile, Microsoft engineers and others said things like: “This is logistic regression for people who are bad at math, and it’s best for cases where 75% accuracy is good enough. Try to break use cases into as many steps as possible and keep the LLM away from any automation that could have consequences. These systems have no separation between the control plane and user input, which is re-exposing us to problems that were solved 15 years ago.”
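To make the engineers’ advice concrete, here is a minimal sketch of that pattern, with hypothetical helpers (`call_llm`, `send_refund`) standing in for your own LLM client and business logic: the model only drafts a structured proposal, plain code enforces the limits, and a human approves before any consequential action runs.

```python
import json


def call_llm(prompt: str) -> str:
    """Stand-in for whatever LLM API you use (hypothetical)."""
    raise NotImplementedError


def send_refund(customer_id: str, amount: float) -> None:
    """Stand-in for a consequential action in your own system (hypothetical)."""
    ...


def handle_refund_request(email_text: str) -> None:
    # Step 1: the LLM only classifies and extracts; it cannot act on anything.
    raw = call_llm(
        "From this customer email, extract customer_id and the requested refund "
        "amount as JSON, or return {} if it is not a refund request.\n\n" + email_text
    )
    proposal = json.loads(raw)
    if not proposal:
        return

    # Step 2: deterministic code, not the model, enforces the business rules.
    amount = float(proposal["amount"])
    if amount > 100:
        print("Over the limit, routing to a person.")
        return

    # Step 3: a human stays in the loop before anything with consequences runs.
    answer = input(f"Refund {amount:.2f} to {proposal['customer_id']}? [y/N] ")
    if answer.strip().lower() == "y":
        send_refund(proposal["customer_id"], amount)
```

Splitting the work this way keeps the model’s 75%-accuracy output in the advisory lane, while the steps with consequences stay behind ordinary code and a person.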
I think there are some neat possibilities that get lost in the marketing hype as venture capitalists grow angry that they might have been scammed by yet another hammer in search of nails.
Yes of course it can help you. The lack of imagination in this thread is truly astounding.
You have an assistant with you that can instantly answer your questions and help you develop your business:
- “What’s the most efficient way for me to track appointments with a Linux desktop program on a minimal budget? I have 4-6 daily appointments. My key features are a reminder 30 minutes before each appointment and the ability to attach notes to each appointment.”
- “Help me optimize my meeting structure. I’m in X niche, and currently I have 30-minute daily meetings that don’t follow any structure. What are some de facto meeting structures and post-meeting operations in this industry?”
I’m not directing this at OP but at all of the naysayers in this thread: if you can’t find a use for a tireless $20/mo assistant that will instantly answer your questions, then you should not be owning any business, or leading anything for that matter.
From your examples I suppose you’re saying that something like ChatGPT might help with planning, examining processes for optimisation, or making long-term choices, rather than with day-to-day tasks.
I disagree that people don’t belong in business if they don’t see value in AI. I know someone who cleans for a living. They get enough work from the business directory, and the system they have seems simple enough already that there’s little left to optimise.
Eh, it’s the same thing. If an LLM can help me design a better process for my day-to-day tasks, then even if it isn’t literally part of the process, it’s still part of the process. Just like any growth, such as reading a book, is part of the business process.
Not even going to touch your second paragraph; it’s completely unrelated. Cleaning for a living is not “running a small business”.