As technology evolves, it becomes harder to tell ‘real’ AI from marketing

Technologists say the hazy definition of “artificial intelligence” leaves a wide opening for companies to over-promise or over-market the capabilities of their products – or even render “AI” more of a marketing gimmick than a real technology. (Photo illustration by tolgart/Getty Images)

In his college courses at Stanford University, Jehangir Amjad poses a curious question to his students: Was the 1969 moon landing a product of artificial intelligence?

It might sound like a work of science fiction, or time travel, he said, but understanding the history of AI answers the question for them.

“I would actually argue, yes, a lot of the algorithms that were part of what put us on the moon are precursors to a lot of what we are seeing today as well,” said Amjad, a Bay Area technology executive and a computer science lecturer at Stanford. “It’s essentially precursors to the same kind of similar sort of ‘next, next, next generation’ algorithms.”

Amjad poses the question to his students to underline how hard it is to actually define “artificial intelligence.” This has become even more difficult as the technology explodes in sophistication and public awareness.

“The beauty and the dilemma is, ‘what is AI?’ is actually very hard to define,” Amjad said.

That broad definition – and public understanding – of “artificial intelligence” can make it difficult for both consumers and the tech industry to parse out what is “real” AI and what is simply marketed as such.

Swapnil Shinde, the Los Altos, California-based CEO and cofounder of AI bookkeeping software Zeni, has watched this play out through his investment firm Twin Ventures. Over the last two years, Shinde has seen a huge uptick in companies seeking funding that describe themselves as “AI-powered” or “AI-driven.” The AI market is very saturated, and some “AI companies” in fact use the technology in only a small part of their product, he said.

“It’s very easy to figure out after a few conversations if the startup is just building a wrapper around ChatGPT and calling that a product,” Shinde said. “And if that’s the case, they are not going to survive for long, because it’s not really deep tech. It isn’t solving a very deep, painful problem that was driven by humans for a long period of time.”

The rush to build AI

Since early 2023, Theresa Fesinstine said she has observed a race in the corporate world to introduce AI technologies in order to stay competitive and relevant. That’s when she launched her AI education company, peoplepower.ai, through which she leads workshops, teaches organizations how AI is built and consults with them on which tools might fit their needs.

At a time when everyone wants to claim the most cutting-edge tools, some basic education about AI can help both companies and their employees navigate the technology landscape, the Norwalk, Connecticut-based founder said.

In an effort to look more innovative, companies may tout basic automations or rule-based alerts as exciting new AI tools, Fesinstine said. While these tools do use some of AI’s foundational technologies, companies could be overstating the tools’ abilities, she said, especially when they throw around the popular buzzword “generative AI,” which describes systems that use complicated algorithms and deep learning techniques to learn, adapt and predict.

We should doubt wherever we start seeing claims of originality coming from AI because originality is a very human trait.

– Jehangir Amjad, tech executive and Stanford lecturer

The pressure to keep up with the latest and greatest may also lead some organizations to buy new AI software tools even if they don’t have a strategy for implementing them and training employees on how best to use them.

“It’s predatory, I would say,” Fesinstine said. “For companies, especially those that are feeling unsure of what AI is going to look like, what it should be, people have a fear of being left behind.”

Some technologists argue that ambiguity around what is or isn’t AI allows for all kinds of tech products to be sold as such. Predictive analytics, for example, which uses data to forecast future outcomes, may be “borderline” AI, said Ed Watal, the Reston, Virginia-based founder of IT and AI strategy consultancy firm Intellibus.

True AI systems use algorithms to sort, analyze and review data, and to make informed decisions about what to do with it based on human prompts. The “learning” in these systems comes from neural networks that take feedback and use history to get better at completing tasks over time.

“But the purists, the purists, will argue that AI is only machine learning and deep learning,” he said.

“AI washing”

Though there seems to be an AI-powered company promising to do pretty much any task for you, technologists warn that today’s “real” AI has its limitations. Watal said the industry has seen some “AI washing,” or over-promising and over-marketing of AI’s uses.

A company that promises that its AI tool can build a website from the ground up could be an example, he said. While you could get ChatGPT or another AI algorithm to generate the code, it can’t create a fully functioning website, he said.

“You wouldn’t be able to do things which require, let’s say, something as simple as sending an email, because sending an email requires a [simple mail transfer protocol] server,” Watal said. “Yeah, you could ask this AI tool to also write the code for a mail server, but you’d still have to host it and run it somewhere. So it’s not as simple as, oh, you click a button and you have an entire app.”

Amjad, who is also the head of AI Platform at generative AI company Ikigai, said companies sometimes over-promise and over-market the ability of AI to perform original, creative tasks.

While artificial intelligence tools are great at pattern recognition, data sorting and generating ideas based on existing content, humans remain the source of original, creative tasks and output, he said.

“People would argue that in the public imagination, AI is creating a lot of things, but really it’s regurgitating. It’s not creating, right?” Amjad said. “And we should doubt wherever we start seeing claims of originality coming from AI because originality is a very human trait.”

It’s definitely not the first time that a new technology has captured the public’s attention and led to a marketing frenzy, Watal said. About a decade ago, the concept of “Web3,” or a decentralized internet that relies on blockchain technology, quickly grew in popularity, he said.

Blockchain technology operates as sort of a public ledger, where transactions and records are kept in an accessible forum. It’s the basis of many cryptocurrencies, and while it has become more mainstream in recent years, it hasn’t taken over the internet as was predicted about a decade ago.

“The cloud” is another example of a technology marketing makeover, Watal said. The concept of remote servers storing information separately from your hardware goes back decades, but after Amazon’s introduction of the Elastic Compute Cloud in 2006, every technology company competed to stake its claim to the cloud.

Only time will tell if we are overusing or underusing the term artificial intelligence, Amjad said.

“I think it’s very clear that both the hype and the promise, and the promise of applications is actually pretty real,” Amjad said. “But that doesn’t mean that we may not be, in certain quarters, overdoing it.”

Amjad suspects the interest in AI will only continue to rise, but he feels Ikigai’s technology is one that will prove itself amid the hype cycle.

“Yes, it’s come and captured the public imagination. And I’m absolutely thrilled about that part, but it’s something that builds upon a very long tradition of these things,” Amjad said. “And I wish that would help temper some of the expectations … the hype cycle has actually existed in AI, at least a couple of times, in the last, maybe, 50 years itself.”

GET THE MORNING HEADLINES.

Trump tech appointees point to a deregulated industry, tech players say

Donald Trump, then the GOP candidate for president, gave a keynote speech on the third day of the Bitcoin 2024 conference at Music City Center on July 27, 2024 in Nashville, Tennessee. He had previously dismissed cryptocurrency as a “scam” but embraced it in his latest bid for a second term in the White House. (Jon Cherry | Getty Images)

President-elect Donald Trump’s recent appointments and cabinet nominees are pointing to a four-year stint of deregulation in the tech industry, and lots of potential for competitive growth within the industry and globally, tech executives predict.

Trump has made a handful of recent selections, both to existing positions, like chair of the Federal Trade Commission and chair of the Securities and Exchange Commission, and to new positions he has created for his term, like the “AI and crypto czar.”

“There appears to be much more cohesion and support from within his camp to support a range of geopolitical, technology and innovation issues that were relegated in importance during his first term,” software founder Yashin Manraj said.

Trump has chosen FTC Commissioner Andrew Ferguson to be the agency’s next chairman, replacing Lina Khan, who fought Big Tech overreach during her tenure. He’s slated to be joined by antitrust specialist Mark Meador in his former position. Together, the pair will likely continue to scrutinize Big Tech companies, but over issues of “censorship,” which was a Republican talking point during the election.

Earlier this month, Trump named cryptocurrency advocate and former SEC commissioner Paul Atkins as his pick for chair of the SEC, and appointed former PayPal executive David Sacks to a new role of “AI and crypto czar,” PBS reported.

The move comes as Trump’s view on digital currencies has evolved. During his first presidency, Trump called cryptocurrency “highly volatile and based on thin air,” but he has since changed his tune. In September, he rolled out a new venture to trade crypto, called World Liberty Financial, and said during his 2024 campaign that he aims to make the U.S. the “crypto capital of the planet.”

And on Dec. 16, Trump met with Masayoshi Son, CEO of Japanese investment firm SoftBank, who announced a $100 billion investment in U.S. projects over the course of Trump’s term, many of which will focus on artificial intelligence.

Appointees setting the tone

These tech industry appointees and connections lean toward “traditional Republican deregulatory instincts,” said Dev Nag, Bay Area-based founder and CEO of AI automation company QueryPal.

It’s a shift toward something Nag calls “techno-pragmatic nationalism,” a mixture of those Republican deregulatory instincts with industrial policy focused on maintaining the U.S.’s status in the global tech economy.

Ferguson’s appointment to the FTC will likely result in policies that continue to allow large U.S. tech companies to thrive while addressing specific competitive issues.

We’ll also probably see harder barriers against foreign tech competitors, especially China, Nag said.

Manraj, the Eagle Point, Oregon-based founder and CEO of software company Pvotal Technologies, also sees Trump’s appointments as an attempt to focus on growing the domestic tech economy rather than the global one.

“These policies will weaken the tech industries of the European Union and many emerging countries, which were hoping for the heightened regulation under Harris to prevent further brain drains and promote [foreign direct investment] in their startup ecosystems,” Manraj said.

There was a lot of technical advancement during President Joe Biden’s term, but stronger tech regulations created some “confusion and hesitation” within the industry, Manraj said.

“Based on the track record of these appointees, we’re likely to see a significant rollback of AI safeguards implemented under the Biden administration, replaced with a framework emphasizing rapid deployment and commercialization,” Nag said.

SoftBank’s investment is a sign that the industry is feeling ready to develop, and that investment dollars are more likely to flow under a Trump administration, Manraj said.

Differences from Trump’s first term

The appointments and tech-industry relationships Trump has developed for his second term appear to leave him more prepared to support tech innovation and growth than he was in his first term, Manraj said.

He’s relying on “a new generation of technocrats” to enact change, Manraj said, rather than the politically driven cabinet advisers from his first time in office.

“The crypto world is reacting positively to it, and many projects treading water for years are finally ramping up hiring and growth locally in the U.S.,” Manraj said.

Nag predicted several potential technological advancements we may see under Trump’s second term, including relaxed AI restrictions paired with heavy investment in tech. That may allow AI to integrate across many industries and infrastructures faster than it would have under a Kamala Harris administration.

Nag also noted Trump’s change in attitude toward crypto, saying a friendlier regulatory environment for digital currencies may position the U.S. as a global leader in the space. We may also see more advancement in semiconductor manufacturing and computing capabilities under more relaxed regulations.

All of these advancements come with important governance considerations, though, Nag said. AI advancements made without safety frameworks can create problems that are difficult to undo.

“The key challenge for this administration will be maintaining the delicate balance between fostering rapid innovation and ensuring long-term technological resilience,” he said.

College students ‘cautiously curious’ about AI, despite mixed messages from schools, employers

University of Utah student Rebeca Damico said her professors at first took a hard line against AI when ChatGPT was introduced in 2022, but she and other students say schools have softened their stands as the usefulness – and career potential – of the technology has become clearer. (Photo by Spenser Heaps for States Newsroom)

For 21-year-old Rebeca Damico, ChatGPT’s public release in 2022 during her sophomore year of college at the University of Utah felt like navigating a minefield.

The public relations student, now readying to graduate in the spring, said her professors immediately added policies to their syllabuses banning use of the chatbot, calling the generative artificial intelligence tool a form of plagiarism.

“For me, as someone who follows the rules, I was very scared,” Damico said. “I was like, oh, I can’t, you know, even think about using it, because they’ll know.”

Salt Lake City-based Damico studied journalism before switching her major to public relations, and saw ChatGPT and tools like it as a real threat to the writing industry. She also felt very aware of the “temptation” she and her classmates now had — suddenly a term paper that might take you all night to write could be done in a few minutes with the help of AI.

“I know people that started using it and would use it to … write their entire essays. I know people that got caught. I know people that didn’t,” Damico said. “Especially in these last couple weeks of the semester, it’s so easy to be like, ‘Oh, put it into ChatGPT,’ but then we’re like, if we do it once, it’s kind of like, this slippery slope.”

But students say they’re getting mixed messages – the stern warning from professors against use of AI and the growing pressure from the job market to learn how to master it.

The technological developments of generative AI over the last few years have cracked open a new industry, and a wealth of job opportunities. In California, Gov. Gavin Newsom recently announced the first statewide partnership with a tech firm to bring AI curriculum, resources and opportunities to the state’s public colleges.

And even for those students not going into an IT role, it’s likely they will be asked to use AI in some way in their industries. Microsoft and LinkedIn’s 2024 Work Trend Index Annual Report found that 75% of knowledge workers are using AI at work, and that some hiring managers weigh AI skills as heavily as real-world job experience.

Higher ed’s view of AI

Over the last few years, the University of Utah, like most academic institutions, has had to take a position on AI. As Damico experienced, the university added AI guidelines to its student handbook that take a fairly hard stance against the tools.

It urges professors to use additional AI detection tools on top of education platform Canvas’ Turnitin feature, which scans assignments for plagiarism. The guidelines also now define the use of AI tools without citation, documentation or authorization as a form of cheating.

Though Damico said some professors continue to hold a hard line against AI, some have started to embrace it. The case-by-case basis Damico describes from her professors is in line with how many academic institutions are handling the technology.

Some universities spell out college-wide rules, while others leave it up to professors to set AI standards in their classrooms. Others, like Stanford University, acknowledge in their policies that students are likely to interact with it.

Stanford bans AI from being used to “substantially complete an assignment or exam,” and says students must disclose its use, but says “absent a clear statement from a course instructor, use of or consultation with generative AI shall be treated analogously to assistance from another person.”

Virginia Byrne is an associate professor of higher education and student affairs at Morgan State University in Baltimore, and she studies technology in the lives of learners and educators, with a focus on how it impacts college students. She said the university allows professors to figure out what works best for them when it comes to AI. She herself often assigns projects that prompt students to investigate the strengths and weaknesses of popular AI tools.

She’s also a researcher with the TRAILS Institute, a multi-institution organization aiming to understand what trust in AI looks like and how to create ethical, sustainable AI solutions. Along with Morgan State, researchers from the University of Maryland, George Washington University and Cornell University conduct a variety of research, such as how ChatGPT can be used in health decision-making, how to create watermark technology for AI, and how other countries are shaping AI policy.

“It’s cool to be in a space with people doing research that’s related, but so different,” Byrne said. “Because it expands your thinking, and it allows us to bring graduate students and undergraduate students into this community where everyone is focused on trustworthiness and AI, but from so many different lenses.”

Byrne hopes that her students can see the potential AI has to make their lives and work easier, but she worries that it creates an “artificial expectation” for how young people need to perform online.

“It might lead some folks, younger folks, who are just starting their careers, to feel like they need to use (design tool) Canva to look totally perfect on LinkedIn, and use all these tools to … optimize their time and their calendars,” Byrne said. “And I just worry that it’s creating a false expectation of speed and efficiency that the tools currently can’t accomplish.”

Theresa Fesinstine is the founder of peoplepower.ai, which trains HR professionals on ways AI can be used efficiently within their organization. This semester, she instructed her first college course at the City University of New York on AI and business, and taught students of all years and backgrounds.

Fesinstine said she was surprised how many of her students knew little to nothing about AI, but said many had heard other instructors warn they’d fail students found to have used it in assignments. She thinks this mixed messaging often comes from not understanding the technology and its ability to help with an outline or to find research resources.

“It’s a little scary, and I think that’s where, right now, most of the trepidation is centered around,” she said. “It’s that most people, in my opinion, haven’t been trained or understand how to use AI most effectively, meaning they use it in the same way that you would use Google.”

Real-world applications

Shriya Boppana, a 25-year-old MBA student at Duke University, not only uses AI in her day-to-day life for schoolwork, but she’s also pursuing a career in generative AI development and acquisitions. She wasn’t initially interested in AI, she said, but she worked on a project with Google and realized how the technology was set to influence everyday life, and how malleable it still is.

“Once you kind of realize how much that the tech actually isn’t as fleshed out as you think it is, I was a little more interested in … trying to understand what the path is to get it where it needs to go,” Boppana said.

She said she uses some form of AI tool every day, from planning her own schedule, to having a chatbot help decide how students in a group project should divide and complete work, based on their availability. Because she works with it regularly, she understands the strengths and limitations of AI, saying it helps her get mundane tasks done, process data or outline an assignment.

But she said the personalized tone she aims to have in her writing just isn’t there yet with the publicly available AI tools, so she doesn’t completely rely on it for papers or correspondence.

Parris Haynes, a 22-year-old junior studying philosophy at Morgan State, said the structure and high demand of some students’ coursework almost “encourages or incentivizes” them to use AI to help get it all done.

He sees himself either going into law, or academia and said he’s a little nervous about how AI is changing those industries. Though he leans on AI to help organize thoughts or assignments for classes like chemistry, Haynes said he wouldn’t go near it when it comes to his work or career-related objectives for his philosophy classes.

“I don’t really see much of a space for AI to relieve me of the burden of any academic assignments or potential career tasks in regards to philosophy,” Haynes said. “Even if it could write a convincing human-seeming paper, a philosophical paper, it’s robbing me of the joy of doing it.”

Gen Z’s outlook on their future with AI 

Like Haynes, Fesinstine knows that some of her students are interested, but a little scared about the power AI may have over their futures. Although there’s a lot of research about how older generations’ jobs are impacted by AI, those just about to break into the workforce may be the most affected, because they’ve grown up with these technologies.

“I would say the attitude is — I use this term a lot, ‘cautiously curious,’” Fesinstine said.  “You know, there’s definitely a vibe around ethics and protection that I don’t know that I would see in other generations, perhaps … But there’s also an acknowledgement that this is something that a lot of companies are going to need and are going to want to use.”

Now, two years since ChatGPT’s release, Damico has started to realize the ways generative AI is useful in the workplace. She began working with PR firm Kronus Communications earlier this year, and was encouraged to explore some time-saving or brainstorming functions of generative AI.

She’s become a fan of having ChatGPT explain new business concepts to her, or to get it to suggest Instagram captions. She also likes to use it for more refined answers than Google might provide, such as if she’s searching for publications to pitch a client to.

Though she’s still cautious, and won’t use generative AI to write actual assignments for her, Damico said she realizes she needs the knowledge and experience after graduation — “it gives you kind of this edge.”

Boppana, who sees her career growing in the AI space, feels incredibly optimistic about the role AI will play in her future. She knows she’s more knowledgeable and prepared to go into an AI-centered workforce than most, but she feels like the opportunities for growth in healthcare, telecommunications, computing and more are worth wading into uncertain waters.

“I think it’s like a beautiful opportunity for people to learn how machines just interact with the human world, and how we can, I don’t know, make, like, prosthetic limbs, like test artificial hearts … find hearing aids,” Boppana said. “There’s so much beauty in the way that AI helps human beings. I think you just have to find your space within it.”

Biden administration leaves ‘foundational’ tech legacy, technologists say

Tech insiders say Biden is leaving a strong foundation for high-tech industry, boosting broadband access, setting a foundation for AI regulation, and encouraging chip manufacturing. (Rebecca Noble | Getty Images)

As he’s poised to leave office in two months, President Joe Biden will leave a legacy of “proactive,” “nuanced” and “effective” tech policy strategy behind him, technologists across different sectors told States Newsroom.

Biden’s term was bookended by major issues in the tech world. When he took office in early 2021, he faced an economy and workforce struggling with the COVID-19 pandemic and a longstanding digital divide across the country. As he prepares to exit the White House, federal agencies are working to apply the principles of the 2023 AI Bill of Rights to evolving technologies that will undoubtedly continue changing American life.

Though he was unable to get federal regulations on AI passed through Congress, Biden’s goal was to bring tech access to all Americans, while safeguarding against potential harms, the technologists said.

“I think everything that he does is foundational,” said Suriel Arellano, a longtime consultant and author on digital transformation who’s based in Los Angeles. “So it definitely sets the stage for long term innovation and regulation.”

The digital divide 

For Arellano, Biden’s attempt to bring internet access to all families stands out as a lasting piece of the president’s legacy. Broadband internet for work, healthcare and education was a part of Biden’s 2021 Bipartisan Infrastructure Deal, especially targeting people in rural areas.

Biden earmarked $65 billion for the project, which was doled out to states and federal departments to establish or improve the physical infrastructure supporting internet access. As of September, more than 2.4 million previously unserved homes and businesses had been connected to the internet, and $50 billion had been given to grant programs that support these goals across the states.

Arellano said he thinks there’s still work to do with the physical broadband infrastructure before that promise is realized — “I think that should have come first,” he said.

“But I think as a legacy, I think breaching the digital divide is actually one of the strong — maybe not the strongest, but I would say it’s definitely a strong legacy that he leaves,” Arellano said.

Shaping the U.S. conversation about AI

During Biden’s presidency, practical and responsible application of artificial intelligence became a major part of the tech conversation. The 2023 AI Bill of Rights created the White House AI Council and a framework for federal agencies to follow on privacy protection, along with guidelines for supporting AI workers, navigating the technology’s effects on the labor market and ensuring equity in AI use, among others.

The guidelines put forth by the administration are subtle, and “not likely to be felt by the average consumer,” said Austin-based Alex Shahrestani, an attorney and managing partner at Promise Legal, which specializes in tech and regulatory policy.

“It was something that’s very light touch and essentially sets up the groundwork to introduce a regulatory framework for AI providers without it being something that they’re really going to push back on,” Shahrestani said.

In recent months, some federal agencies have released the guidelines called for by the AI Bill of Rights, including the Department of Labor and the Office of Management and Budget, which outlined how the government will go about “responsible acquisition” of AI. It may not seem like these guidelines would affect the average consumer, Shahrestani said, but government contractors are likely to be larger companies that already have a significant commercial footprint.

“It sets up these companies to then follow these procedures in other contexts, so whether that’s B2B or direct-to-consumer applications, that’s like more of a trickle down sort of approach,” he said.

Sheena Franklin, D.C.-based founder of K’ept Health and previously a lobbyist, said Biden emphasized the ethical use and development of AI, and set a tone of fostering public trust and preventing harm with the AI Bill of Rights.

Franklin and Shahrestani agreed it’s possible that President-elect Donald Trump could repeal some of Biden’s executive orders on AI, but they see the Bill of Rights as a fairly light approach to regulating it.

“It was a really nuanced and effective approach,” Shahrestani said. “There’s some inertia building, right? Like a snowball rolling down the hill. We’re early days for the snowball, but it just got started and it will only grow to be a bigger one.”

The CHIPS Act

Biden’s CHIPS and Science Act of 2022, which aimed to strengthen domestic semiconductor manufacturing, supply chains and the innovation economy with a $53 billion investment, is a major piece of his legacy, Franklin said. The bill centered on worker and community investments, and prioritized small businesses and underrepresented communities, with a goal of economic growth in the U.S., and especially in communities that needed support.

Two years after the bill was signed, the federal government, in partnership with American companies, has funded semiconductor manufacturing projects that have created more than 100,000 jobs, along with workforce development programs. The U.S. is on track to produce 30% of the world’s semiconductor chips in 2032, up from 10% today.

“He was really trying to position the U.S. as a global leader when it came to technology, because that industry is going to continue to grow,” Franklin said.

It’s hard to quantify what the lasting impact of the CHIPS Act will be, but one immediate factor is computing, Shahrestani said. The AI models being developed right now have vast potential, he said, but limited computing power had previously held the industry back.

“Being able to provide more compute through better chips, and more sophisticated hardware is going to be a big part of what provides, and what is behind the best AI technologies,” Shahrestani said.

Accountability for Big Tech

Many in the Big Tech community see Biden’s AI Bill of Rights and its data privacy provisions, as well as the Justice Department’s monopoly lawsuits against tech giants like Apple and Google, as hampering innovation.

Arellano is optimistic about the technological advances and innovation that the U.S. may see under a less regulation-focused Trump presidency, but he cautions that some regulations may be needed for privacy protections.

“My concern is always on the public side, you know, putting the dog on a leash, and making sure that our regulations are there in place to protect the people,” he said.

Franklin predicts that if Biden attempts any last-minute tech policy before he leaves office, it will probably be to pursue further antitrust cases. It would align with his goal of fostering competition between startups and small businesses and reinforce his legacy of safeguarding consumer interests, she said.

When she considered how to describe Biden’s tech legacy, Franklin said she nearly used the word “strength,” though she said he ultimately could have done a little bit more for tech regulation. But she landed on two words: “thoughtful and proactive.”

“Meaning, he’s thinking about everybody’s concerns,” Franklin said. “Not just thinking about the Big Tech and not just thinking about the consumers, right? Like there has to be a balance there.”


Some in the venture capital community backed Trump. Here’s what’s next

Elon Musk and Donald Trump

Tesla owner Elon Musk, right, was hardly alone in the tech sector in supporting the reelection efforts by Donald Trump, left. Many Silicon Valley investors and innovators were hoping for a lighter regulatory hand than they have seen under President Joe Biden. (Photo by Brandon Bell/Getty Images)

Some venture capital investors, who have funded the tech boom in Silicon Valley and beyond, say they are excited by the prospect of a lighter regulatory environment under a new Trump Administration than they saw under President Joe Biden.

But they warn that Trump policies that will benefit many technology companies may come at a cost to other pro-Trump voters.

The Bay Area bubble of Silicon Valley, which is home to institutional tech giants like Apple, Google, Intel and Adobe, had been previously seen as a left-leaning region, like many other California communities. But the 2024 election was a unique one, venture capitalists and founders say.

“There’s been a significant shift in the valley rightward since the last election,” said Joe Endoso, a Silicon Valley investor. “And you’ve seen that in the financial flows — in the level of dollars — that were directed towards supporting President Trump’s campaign from the technology sector.”

Endoso, president of financial tech platform Linqto, said some tech industry people who previously voted for progressive issues and candidates this time cast their ballot for Trump. He said he’s heard more concern about potential regulations in the tech industry and negative economic effects under continued Democratic leadership.

This turn toward Trump wasn’t universal in the Valley. The majority of donations from employees at companies like Google, Amazon and Microsoft went toward Democratic candidate Kamala Harris, Reuters reported in September. But tech billionaires like Elon Musk and venture capital investors, like Andreessen Horowitz co-founders Marc Andreessen and Ben Horowitz, poured millions into his campaign.

While Trump didn’t receive unanimous support from the tech sector, many American tech giants and investors are excited about the light-handed approach to tech regulation that’s likely to come in the next four years. Congress has struggled to pass any federal laws around emerging technology like AI, though states have done so on their own on issues like data privacy, transparency, discrimination, and on how AI-generated images can be used.

The Biden administration, however, on its own issued a number of “best practice” guides for emerging technologies and aggressively pursued antitrust cases against some tech giants, including an ongoing case against Google that could force the company to spin off its popular Chrome web browser.

It appears unlikely that Trump will continue the Biden era regulatory and enforcement drives.

Those working in emerging technologies like AI are making advancements so quickly that regulators are unlikely to be able to keep up anyway, Endoso said. The tech industry mindset — move fast and break things, first coined by Facebook founder Mark Zuckerberg — will likely continue under Trump’s administration.

“You’re running through walls and hoping that when the regulations come about, they’re not going to be so, you know, restrictive,” Endoso said. “But you’re not going to sit and wait for the regulators. You can’t afford to.”

Why care about the VC market?

Venture capitalists pour money into many promising startups in Silicon Valley and elsewhere, looking for the ones that will create lucrative new technologies or “disrupt” existing ones. Silicon Valley successes include Uber, which raised its first round of venture capital, about $1.3 million, in 2010, and Airbnb, which started with just a $20,000 investment in 2008. Today, the companies are worth $146 billion and $84 billion, respectively.

Many more, however, fail. High-visibility startups that folded after raising very large sums include streaming platform Quibi, which raised $1.75 billion, and ChaCha, the SMS text-based search platform, which had raised $108 million.

The high-risk, high-reward nature of the industry makes for a rarefied business, and there’s a high barrier to entry. To qualify as an accredited investor, one must have an annual income of at least $200,000 or a net worth of at least $1 million, excluding a primary residence. The handful of firms pouring the most money into the United States technology market typically manage billions of dollars.

Yet, the technology being developed and funded by wealthy investors today will shape the next decade of everyone’s lives. Some of the most influential technology in the global economy has been released under President Joe Biden’s administration in the last three and a half years.

Advancements in generative AI and machine learning, rapid development of augmented and virtual reality, wider adoption of cloud computing and Internet of Things (IoT) technologies such as internet-connected appliances and home devices, and the automation of many industries have already shifted much of American life. ChatGPT, one of the most recognizable examples of generative AI available to the public, was released only two years ago, but generative AI is already threatening many American jobs.

Those with writing-focused careers, like copywriters and social media marketers, are already feeling the disruption, and experts believe STEM professionals, educators, workforce trainers and others in creative and arts fields will see much of their job responsibilities automated by AI by 2030.

The venture capital market has been volatile over the last four years. Though many of Trump’s campaign attacks on Democrats centered on the healthy economy of his first term, the COVID-19 pandemic was the single biggest factor disrupting the venture capital market, among many others.

The U.S. saw its biggest year for venture capital investments in 2021, but supply-chain issues and the continuing reliance on remote work changed the trajectory of many companies’ plans to go public on the stock market. High inflation and interest rates have kept many investors from deploying capital and many companies from completing mergers and acquisitions since then, although the second half of 2024 is looking up.

The economy quickly became the number one issue for Americans in the presidential election cycle. And though thriving venture capital markets usually benefit those that are already wealthy enough to invest, we’ll likely see a positive correlation in the general markets too, said Scott Nissenbaum, president and CEO of Ben Franklin Technology Partners, an innovation-centered fund in Pennsylvania.

“A thriving, efficient market is good for venture capital. And the flip side is also true,” he said. “We feed into and create the innovations and the efficiencies and the next generation … that create the robust and the boom.”

How investors and founders are preparing for Trump 

Nissenbaum predicts that Trump may remove regulations for technology used by U.S. transportation and military systems, allowing for more tech integration than previously permitted without human safeguards in place. That might look like more flight optimization technology, or more drone usage by military branches. Nissenbaum also thinks Trump will attempt to open up space travel, especially with big backing by Musk, who runs SpaceX.

Health care has also been adopting technology rapidly, and Nissenbaum believes the sector could see some major changes under Trump.

That is of note for healthtech founder Sipra Laddha, an Atlanta-based psychiatrist and cofounder of LunaJoy, which provides in-person and virtual wellness visits for women. The three-year-old company raised venture capital in 2022 and 2023, despite a more challenging fundraising market. Women’s health care companies saw a surge of VC investment in the wake of the overturning of Roe v. Wade in June 2022, an exception to the generally slower investment market at the time.

But she is uncertain about how Trump’s potential cabinet appointees, like Robert F. Kennedy Jr., who has been nominated to head the Department of Health and Human Services, will affect LunaJoy’s operation. Kennedy has made health a key issue in his public advocacy and political activity, but he has also espoused eccentric and even false views on issues such as vaccines and pharmaceuticals.

“When women don’t have choices, mental health is significantly worse, and that’s something that goes on, often, for the entire time of that family’s trajectory,” Laddha said. “So I’m not quite sure what’s going to happen, but you know, those are certainly things that, as a women’s mental health company, we are looking at and watching closely to see what sort of legislation, rules and laws come out.”

When it comes to fundraising early next year, Laddha is optimistic. She’s focused on how fragmented the healthcare industry is right now, and plans to showcase how companies like hers will aim to integrate with larger health systems.

“Our role is to be really as disruptive as possible, and to bring to the forefront the most innovative solutions that we can do while still working within the current framework of healthcare that exists today,” she said.

Some sectors worry about Trump economic policy

While software and cloud-based technologists seem excited by the effects of deregulation, startup founders who make physical products, especially ones using microchip technology, are wary of Trump’s plan to impose tariffs on imported goods.

Samyr Laine, a managing partner at Los Angeles-based Freedom Trail Capital, specializes in consumer tech and consumer packaged goods. Laine said he feels a sense of relief in ending the “uncertainty” around who will take the presidency the next four years, but he predicts many founders will feel the costly effects of Trump’s planned tariffs, and pass those additional costs to consumers.

Though the existing companies in his portfolio won’t be hit too hard, tariffs are a factor his firm will be forced to weigh when considering future investments. Companies that incur the additional costs of imported goods will have to adjust their profit margins and might not be as attractive to investors.

“As a consumer and someone who isn’t in the space, not to be like a fear monger, but expect that some of the things you typically pay for, the price will go up,” Laine said.

The effect on work

Although Trump succeeded in picking up significant support from the tech industry elite this election season, much of his voter base is working-class people who will not feel the positive effects of tech industry deregulation.

Endoso, the Silicon Valley investor and founder, says the Trump coalition of tech entrepreneurs and working-class voters represents “a division between the haves and the have-nots.” The usual bases on which people pick their electoral preferences, like race, geography, income and proximity to city life, were “shattered” this time around.

“It was a revolt of the working class, at least in my view,” he said.

The advancements in AI and machine learning that will enrich the investor class will have large implications for the employment of those working-class voters. The many Americans who are not college educated and who work physical jobs might struggle to thrive, he said. We’ll likely see industries overhauled as robots replace and automate much of the physical labor in warehouses, and as self-driving vehicles take over jobs like long-haul trucking and ride services such as Uber and Lyft.

“I think those are important questions to be asking from a policy standpoint, and I think that the intelligent answers shouldn’t be ‘let’s shut the innovation down,’” Endoso said. “That didn’t work in 19th century England. It won’t work here today, right? But it does require our rethinking the definition of work, and the definition of how you … organize a society along lines where you don’t need to have the same level of maybe direct labor input as we had in the past.”

Nissenbaum agreed, saying that AI has already begun to seep into every field and industry, and will only continue to disrupt how we work. As revolutionary as the internet and internet companies were in the late 1990s, the web has now become the infrastructure that lets artificial intelligence grow more efficient and effective at everything it does.

With lighter regulation under a new Trump administration, we’re likely to see AI develop at unpredictable rates, he said. And laborers will definitely be feeling the effects over the next four years.

“You’re not going to lose your job to AI,” Nissenbaum said. “You’re going to lose your job to someone who understands how to do your job with AI.”
