

If you end up integrating LLMs in a way that could impact patient care, that's actually pretty dangerous considering their training data includes plenty of fictional and pseudoscientific sources. That said, it might be okay for medical research applications where accuracy isn't as critical.
It could be the business was borderline insolvent and the cash from the checks acted as a short-term loan from Walmart.
That, or he was trying to create the illusion of cash flow in order to get the business to qualify for certain kinds of loans. The money from those loans he could subsequently embezzle. Then, if the company went bankrupt, the creditors would be the ones left at a loss.