Amber Stewart sees what many overlook in artificial intelligence, she said: the human cost of unregulated technology, which can manifest as anything from sexist and racist outcomes to outright theft from willing and unwilling members of the public.
“I’m not afraid of the tech,” said Stewart, founder and CEO of GuardianSync. “I’m afraid of unfettered capitalism, of people releasing powerful tools with no accountability.”
In building a fiduciary framework for AI ethics through her startup, she aims not to eliminate or fight AI but to guide its responsible use.
Her urgency comes from what she calls “the Wild West of technology.”
“AI is being released into the world with almost no guardrails,” Stewart said. “We’ve already seen chatbots encourage self-harm, children using AI as therapy, and people’s private data being used for training models without their consent.”
A fiduciary framework, she explained, means creating technology with legal and ethical obligations that give people control over how their data is shared.
“Just like a lawyer or doctor has a duty to act in your best interest, AI systems should have a duty of care toward the people whose data they rely on,” said Stewart. “GuardianSync is designing that standard, a framework where AI is not just powerful, but trustworthy by design.”
Before GuardianSync became a full framework, Stewart explored how AI could detect and correct bias in medicine, she recalled.
“I wanted to give clinicians tools to recognize inherent bias in real time,” Stewart said. “If someone is being denied pain management because of their demographic, the system could flag it and guide a more equitable response.”
Her motivation came from experience.
“As a Black woman, I wanted to see clinicians who looked like me so I could feel safe,” she said. “When that wasn’t the case, I thought, maybe I can give them a tool to help them understand what they don’t see.”
That vision evolved into GuardianSync’s larger purpose: embedding fairness and transparency directly into technology.
“Technology has outpaced the rules that protect people,” she said. “AI can replicate value, bias, or harm in seconds. Ethical governance ensures we’re not just creating faster systems, but fairer ones, systems that respect creativity, consent, and human rights.”
A framework for trust
GuardianSync functions as a digital trust layer between people and the technologies they use every day.
“GuardianSync protects human data, things like biometrics, AI outputs, and creative work, through consent, transparency, and accountability,” Stewart said. “It makes sure AI systems treat your data with the same care as a bank treats your money.”
Her system would allow companies to be independently audited and certified through a consumer seal of trust. Organizations that meet ethical standards could display the seal publicly, signaling which platforms protect user data.
“Trust doesn’t come from slogans,” Stewart said. “It comes from systems that make exploitation impossible.”
She is also designing a compliance process, similar to a legal or financial audit, where companies would be reviewed by an independent ethics commission.
“If you build trust into the infrastructure, it benefits both the business and the consumer,” she said.
Making ethics enforceable remains one of her toughest challenges.
“Ethics often lives in theory and not law,” Stewart said. “GuardianSync is working to turn values into standards, into language that regulators, companies, and communities can actually use.”
She envisions a new class of innovators — what she calls “ethics technocrats” — leading the movement toward safer technology.
“I want to build a team that can make this governable,” Stewart said. “What I’ve created could become something governments, defense contractors, and major institutions use to ensure accountability.”
Bias and surveillance in everyday systems
The lack of transparency, she warned, harms marginalized communities already facing bias and surveillance.
“Five or six years ago, facial recognition systems were identifying people of color as gorillas,” Stewart recalled. “That’s horrifying, and it shows how bias is baked into the data these systems learn from.”
Now, those same technologies are embedded in police cameras and doorbell systems.
“What happens when a man of color is automatically flagged as a suspect just because an algorithm says so?” she asked. “The people creating this infrastructure are not thinking about the sexism or racism being embedded into it, but it has real-life consequences for us.”
She also pointed to the rise of devices that quietly collect personal information. Wearable tech, for instance, gathers biometric data most users never realize is being stored or sold.
“If my health data was going to Johns Hopkins for heart research, I’d opt in,” Stewart said. “But most of us have no idea where our information goes. These devices monitor every part of our lives — sleep, stress, heartbeat — and that data can be used to track us, sell to advertisers, or worse.”
Taking ethical technology further in KC
Stewart wants Kansas City’s tech and policy leaders to help chart the path forward. She believes the region could become a national hub for ethical innovation.
“I would like for Kansas City to take a chance on something like this, because this is about our families and our communities,” she said. “We need people who understand the dangers of surveillance, data misuse, and unchecked AI to take a look at what I’ve built and help strengthen it.”
GuardianSync’s next steps include completing intellectual property reviews, launching its first fiduciary seal pilot, and expanding outreach to investors and policymakers.
“It’s humbling and hard, but deeply motivating,” she said. “What I’m realizing is that everybody who’s working in this space feels like they’re in new territory. We’re all figuring it out together. The work I’m doing is valid. It’s just a new frontier.”
She is already working with UMKC’s Innovation Center, KCSourceLink, and local groups to refine GuardianSync’s intellectual property and develop partnerships, she said. Stewart also hopes to work with organizations like aSTEAM Village, which helps young people learn digital literacy and STEM skills.
“The goal is to create AI and digital literacy programs for young adults,” she said. “Once the company is more formalized, profits and resources will go back into the community so we can all learn how to navigate this technology safely.”


































