Memoria [EN] No. 99 | Page 23

I remain profoundly grateful to have had the opportunity to participate in FASPE’s fellowship program, where we confronted the role that business professionals played in enabling the atrocities of the Holocaust. That experience impressed upon me that ethical responsibility is not a single moment of reckoning but an ongoing practice of discipline—one that demands we examine even small decisions with humility, clarity, and care.

The reflections that follow were written shortly after that 2019 trip. In the years since, the questions of professional responsibility have only become more urgent. The rapid emergence of artificial intelligence has expanded our capabilities as leaders while simultaneously amplifying the consequences of our choices. We now operate in an environment where decisions are made faster, impacts scale more widely, and ethical blind spots can propagate more quickly than ever before.

As we navigate this new landscape, the lessons of history—and the warnings they carry—remain indispensable. They remind us that technology does not absolve us of responsibility; instead, it heightens the need for vigilance, honesty, and moral imagination. My hope is that this article serves as a small reminder that ethical reflection is not a luxury reserved for moments of crisis but a habit we must practice in the ordinary cadence of our work.

I recently faced an ethical dilemma at work that almost slipped under my radar. Because the stakes were relatively low, I initially did not even recognize it as worthy of ethical consideration. Yet, as we learned during the FASPE trip, small decisions we make as professionals can have enormous consequences, even, or especially, if left unexamined.

After graduation from my MBA program, I joined a tech company that provided sales and marketing automation software to businesses. One of my primary projects was to research whether our company should sell its products in additional international currencies and, if so, which currencies should be prioritized. As part of this process, I had to design the operational plan for implementing these potential changes. My superiors then expected me to recommend a strategy based on my research for approval by more senior executives. It was, by most standards, a normal initiative for a mid-level manager.

When assigned the task, my education and experience kicked in, and I attacked the problem the same way that many business leaders would: I studied what our peer companies were doing and dove into the internal data to see what had happened when we had previously opened sales in other currencies. Using this method, I created a financial model of potential future results and started previewing my recommendations to leaders throughout the company.

The data I surfaced made me feel certain that I had an airtight argument. I looked at five different case studies in our company’s history to create reasonable comparisons for what we should expect to happen to revenue and costs if we took this course. The numbers looked strong. I was confident. In fact, I was ready to be asked to lead the implementation of my recommendation (fingers crossed).

When I shared my findings with another colleague, however, he mentioned, offhandedly, a scenario that I had not yet considered. Though impressed by my research and, on the whole, supportive of my recommendations, he encouraged me to look at another metric using a different case study. No problem, I thought. Later, when I ran those numbers on my own, the scenario and implied future results looked disappointing. They did not support my original recommendation. In fact, they pointed in the opposite direction.

While I now recognize this to have been a dilemma in the making, at the time I quickly decided to forget his scenario and move forward without it. I had five solid case studies supporting my recommendations—why would this one deviation mean anything? Besides, I wanted the company to adopt my ideas, to gain the trust of my supervisors, and to earn the responsibility to lead an important change. One PowerPoint slide did not seem worthy of weighty ethical deliberation. So I went back to polishing my final presentation for the senior executives. It was showtime.

But a few days later, a feeling began gnawing at the back of my mind. Was my pride—and my desire to succeed—getting in the way? Was my personal attachment to a specific outcome clouding my judgement about what was best for the company? Was my bias toward action preventing me from rigorously evaluating the decision?
