healthcare textiles & laundry
By Gregory Gicewicz
The Man No Algorithm Would Have Hired
I want to tell you about Maurice.
Maurice grew up in North Lawndale, one of Chicago's most underserved neighborhoods. He had a good job once, at O'Hare. Then one legal mistake cost him that career. After that, door after door closed. He applied everywhere. Nobody would take a chance on him. Fillmore did. We put him in the soil sort room, the hardest job in a healthcare laundry. You're handling incoming bags of soiled hospital linen. It is hot, heavy, and most people don't last. Maurice showed up every single day. Within months, he was promoted to lead. Then to production supervisor. Today he runs the production floor before most people's alarm clocks go off.
He said something to me once that I have never forgotten: "I want to be the kind of leader I needed."
I run a healthcare laundry operation in Chicago. We process soiled linen from hospitals. It is physical, unglamorous, essential work. And I am less afraid of artificial intelligence than most people I know who work in knowledge industries. Maurice is part of the reason.
Because no algorithm would have hired him. No model trained on hiring data would have seen what we saw. And no AI, however sophisticated, is capable of sitting across from a man rebuilding his life and recognizing, in the way one human being recognizes another, that there is something worth betting on.
What the Floor Teaches
The panic around AI comes, I think, from a category mistake. We are confusing mechanism with mission. Mechanisms change. Missions endure.
We've integrated AI into planning, scheduling, compliance documentation, and operational modeling at Fillmore. The productivity gains are real. Tasks that once took days now take hours. But we are also learning what AI actually requires in practice. Every output demands human review, not as a formality, but because the technology cannot yet be trusted without it. That will improve. It will not become judgment. It will not become wisdom. And it will not become human.
Here's a concrete example. We recently built a distribution model for a major hospital curtain program, processing tens of thousands of curtains across dozens of Chicago-area hospitals. AI was instrumental. We used it to run scenarios, stress-test assumptions, calculate routing logistics, and project costs across a complex variable set. It accelerated work that would have taken weeks.
But every critical decision required human judgment. Which hospitals needed premium pricing due to distance and complexity? How do you build a startup schedule that doesn't overwhelm the plant or the client? How do you structure spare curtain inventory so hospitals aren't left exposed? AI gave us raw material. Experience, relationships, and judgment shaped it into something real.
There are things AI cannot touch at all. It does not walk the floor and sense when morale is slipping. It does not invest in a person society has written off and help them discover their own capacity for leadership. It does not comprehend the moral weight of delivering hygienically clean linen to the sickest people in our community.
And it does not bear responsibility. Responsibility means someone can be summoned. Someone who stands behind the work, absorbs the consequences, owes something to the patient at the other end of the supply chain. That is not a limitation of current AI that future versions will overcome. It is a categorical distinction. Responsibility requires a moral agent. Tools, however sophisticated, are not moral agents.
The Gown
A few months into his time at Fillmore, Maurice took a tour of one of the hospitals we serve. He had been folding gowns for months, the same blue-and-green cotton-poly blend, size large, eight hours a day. Fold. Stack. Repeat. He was good at it. But if he was being honest, the work felt invisible. He had no idea where the gowns went or who wore them.
Then the hospital manager asked if the group wanted to see the ICU. A nurse had specifically requested that the laundry team come by. She wanted them to see what they do for her patients. Maurice walked into that unit and saw, for the first time, a patient in one of his gowns. Elderly. Fragile. Connected to monitors. Wearing something that Maurice's hands had folded hours earlier.
He didn't say much on the way back. But something had changed. He understood, in a way that no orientation video or mission statement could have conveyed, that his work was not invisible. It was intimate. It was the difference between a patient lying in clean, safe linen, or not.
That is the moral weight I am talking about. That is what AI does not comprehend. And that is why the mission of our operation is not reducible to any mechanism, however powerful.
Mission Over Mechanism
What makes AI distinctive is not that it replaces certain tasks
36 march-april 2026 • www.healthcarehygienemagazine.com