The Hidden Force Keeping Women from Using AI
Episode 546 | Author: Emilie Aries
How attribution bias makes AI adoption uniquely challenging for women in the workplace.
There’s no question that women are more reluctant to embrace AI than men, and that this discrepancy is creating a whole new kind of workforce gap. In Episode 540, The Double Disadvantage: AI, Women, and the Future of Work, I referred to a 2024 Harvard Business School study indicating that women adopt AI tools at a 25% lower rate than men. One big reason for this reticence, the researchers found, is a belief that using AI is cheating. New research from Lean In shows that this fear of being judged is well founded.
A key gender disparity in AI acceptance
In April, Lean In released the results from a survey of more than 1,300 developers across 61 countries. It turns out that 64% of female developers find AI helps unlock creative thinking and problem-solving, enhancing their productivity and even advancing their careers. So Lean In dug deeper. Their nuanced questions reveal that the problem isn’t women’s ability to learn AI or their access to the programs. It’s the environment in which these tools are being used that is preventing more widespread adoption.
Attribution bias is a small part of the Lean In study, but it’s an essential one. An attribution bias is “a pattern around how we assign credit and how we assign blame,” and whether you’ve heard the term before or not, you’ve no doubt seen it in action.
Decades of research have made it clear that society tends to attribute certain outcomes differently to men than to women. Specifically, we’re likely to credit a man’s success to skill or intelligence, while a woman’s success is more likely to be attributed to luck or community support. Likewise, a man’s failure is more likely to be blamed on bad luck or an impossible situation, while, when a woman fails, we tend to assume she didn’t work hard enough. We’re even guilty of looking at our own successes and failures this way.
That knowledge is far from new, but the Lean In research clarifies the fresh face of this age-old bias in the burgeoning AI era. For instance, Lean In’s research shows men are 27 percent more likely than women to be praised by their managers for incorporating AI into their workflow. In the best-case scenario (which is still not great), that means women are simply not being recognized at all for this skill. But the worst case is much worse. An August 2025 study published in Harvard Business Review on the competence penalty highlights this perfectly.
The uneven assumption of competence
The study involved nearly 30,000 software engineers at a tech company. Every engineer was given the same Python code to review. Half were told the code was written entirely by a human, while the other half believed it was created with AI assistance.
When a reviewer believed AI was involved, on average, they rated the engineer’s competence 9% lower. In other words, there’s a competence penalty placed on people who use AI to complete their work. But, as you might have already guessed, that penalty wasn’t equal between men and women. If the creator of the code was believed to be male, perceived competence dropped an average of just 6%, while code thought to be written by a female engineer using AI saw a 13% drop on average.
Believing you’re being penalized for using AI at work isn’t victim mentality, or a misunderstanding, or all in your head. Rather, this research suggests it’s an awareness grounded in a worrisome reality. This isn’t a skills gap; it’s a penalty gap. For as long as women have worked, our work has been scrutinized more harshly. AI hasn’t changed that; it’s just shifted the focus a bit.
We’re judging ourselves, too
Many of the engineers in the above HBR study acknowledged that they actively avoid AI to protect their reputations. This vein of research certainly shows that not adopting AI or keeping your use private can be a protection mechanism. But you can’t lie to yourself, and that’s another piece of this puzzle.
The idea that we aren’t successful if we don't slog through every step along the path to our success is a deep-rooted perception that has real repercussions, like burnout. In my book, I call it the martyrdom mindset: if you don’t struggle, if you use technology to make your life easier, you don’t deserve to win. If that sounds like you (and I’m guessing you probably feel a bit called out right now), it makes sense that you’ll resist incorporating AI into your work.
Given all these external and internal factors, it feels like the cards are stacked against us. It certainly removes any question mark around why women are resistant to building the AI fluency that’s becoming essential to career advancement, never mind beneficial to productivity and, therefore, stress reduction.
How can women authentically approach AI use at work?
Despite all this research and the valid concerns you might have, I do stand by my recommendations in other episodes to start experimenting with AI in your workflow, but we can’t just ignore the potential impacts of that adoption. We need to raise awareness among our managers and leaders about this unconscious bias. Here’s what you can try to start moving the needle in the right direction.
Name the external bias. Talking about something can remove its power. When you see evidence of attribution bias, call it out. Bring it up with your boss, coworker, or friend who’s downplaying someone’s skill or success just because they used AI, especially if their reaction differs based on gender.
Name the internal bias. Next time you feel guilty for “taking a shortcut” or “being lazy” because you used an AI tool, ask yourself whether you’d feel that way if you learned a male colleague or anyone else had used a tool this way.
Make your AI wins visible. This one comes highly recommended in the Lean In report. When AI assistance saves you hours on a report or helps you prep for a client meeting where you shine, talk about it. Every time you share these experiences in a way that uplifts your success and acknowledges your efficiency hack, you’re helping yourself get credit and inspiring others to share as well.
AI familiarity is becoming a critical skill set. So start using the tools (within reason, of course; your AI chatbot’s verbatim outputs should never serve as your final draft), and keep thinking about the story being told by your own brain, your leaders, and your coworkers. Using an LLM tool isn’t “cutting corners” any more than using a spellchecker is. You’re not cheating; you’re being the boss of your career.
In the spirit of these tips, I want to acknowledge that this episode was developed with the help of AI. You’re hearing my real voice, and this blog post is human-written, but these tools have been useful thought partners for the planning stage, when I’m putting the episodes together. I hope I’m still creating value for you, and I’m definitely feeling the efficiency benefits. I’ll say it here, for myself and for the people in the back: I shouldn’t feel bad about saving time with a new tool or about acknowledging it, and neither should you!
What do you think? How are you seeing attribution bias playing out in terms of workplace AI, and how is it affecting your willingness to adopt it? Email me your thoughts, as always, or connect with the Facebook Courage Community or our group on LinkedIn to share and get inspired.
Related Links From Today’s Episode:
LinkedIn Learning Course, “Get Unstuck: Make a Plan to Move Your Career Forward”
Lean In, New research: Women use AI less often at work and get less credit
Episode 540, The Double Disadvantage: AI, Women, and the Future of Work
Episode 543, Why Your Resume Isn’t Working (and What to Do Instead)
Bossed Up: A Grown Woman's Guide to Getting Your Sh*t Together
Utilize AI to help you get unstuck at work:
[CONFIDENT RHYTHMIC DRIVING THEME MUSIC WITH DRUMS STARTS]
EMILIE: Hey, and welcome to the Bossed Up podcast, episode 546. I'm your host, Emilie Aries, the Founder and CEO of Bossed Up. And today we're talking about the hidden force, or one of the hidden forces, I should say, that's keeping women from really adopting and using AI.
[MUSIC FADES AND ENDS]
I want to dig into this topic further after we started talking about women, AI and what I'm calling the double disadvantage over the past few weeks and months on the podcast here, because as we know, as we've talked about before, this is a tidal wave that's coming to transform the labor market. And if women are not at the table when it comes to redesigning our workforce in the age of AI, we are very likely to be left behind. That said, I also want to be really clear that women have very well founded concerns about AI.
So a lot of the hesitancy around adoption comes from a highly perceptive place, comes from a skepticism that is well established, that is well earned, frankly, by an industry, AI industry that has not earned our respect, that has not earned our trust, I should say. And so if you have a raised eyebrow, if you spiritually have raised your eyebrow at the whole conversation around AI, I get it and it makes sense.
And there's some new research that just came out three days prior to this recording, probably about 15 days prior to you listening to this. So just a couple of weeks ago, Lean In came out with some interesting new research around women in AI. And I went to dig into a few of the stats that I found most surprising, because one of them was that when women do adopt AI, they find the benefits to be worthwhile. Specifically, they surveyed over 1,300 developers across 61 countries and they found that 64% of female developers say that AI is actually accelerating their careers.
So women who are adopting and using AI, even in the tech industry itself, have found it to be worthwhile, have found it to unlock more creative problem solving, more creative thinking, actually helping them accelerate not only their productivity, but they specifically named their careers. And so we know that the tools themselves can be powerful in accelerating your future. And we know that women themselves aren't the problem.
But Lean In dug in with some really smart and nuanced questions that I was happy to see somebody asking. And what they found is that the problems emerge, when it comes to women and AI, as they relate to the environment. The problem is really what happens around women when they use AI and when they don't. That's what I want to break down and get into the details of today. It's called attribution bias. And this just popped up in one question in the entire survey, and in the entire article where Lean In discusses their new research. But I think it is a key point that we need to dig into.
So first, let's talk about what attribution bias actually is. I think some of us have heard the term before, but we might not know exactly what it means. So let's break it down. Attribution bias, at its core, is a pattern around how we assign credit and how we assign blame. It's this tendency, and unconscious, I should add, tendency for the same behaviors and the same outcomes to be attributed differently based on identity. So in this case, it's very clear in the research that we attribute certain outcomes to men differently than we attribute them to women.
When a man succeeds at something, people tend to attribute it to his skill, his intelligence, or his competence. When a woman succeeds at the same thing, people are more likely to attribute her outcome to luck, or to the situation, or to the community support she received along the way. And we women ourselves are more likely to attribute our success to a collective than to ourselves as well. And the same is unfortunately true for failure. The reverse is seen in the research: when a man fails, we tend to attribute it to bad luck, being in the wrong place at the wrong time, having inherited an impossible situation. When a woman fails, it must be because she wasn't good enough, right? We're much more likely to attribute failure intrinsically to women's innate traits than we are to men.
Now, this isn't new research. Social psychologists have been documenting these double standards for decades. But what is interesting and new is thinking about how attribution bias shows up in the conversation around AI. Because it's not good. That new research from Lean In that just came out found that men are 27% more likely to be praised by their managers for using AI at work. Same tools, same workplace, but men are getting recognized and rewarded for experimenting with AI at significantly higher rates than women are. That's attribution bias in action. Like, when a man uses AI to streamline his process, he's innovative. When a woman does the same thing, it's more likely to go unnoticed, or worse. And the fact of the matter is that it does get a lot worse.
A landmark study published in the Harvard Business Review last August, which came from researchers at King's College London, Peking University, and Hong Kong Polytechnic University, documented something that they call the competence penalty. Their research surveyed over 28,000 software engineers across a major tech company. Every engineer reviewed the exact same Python code. The only thing that varied was whether they were told that the code was written with or without AI assistance. Now here's what happened. When reviewers believed that AI was involved, they rated the engineer's competence 9 percent lower, even though the code itself was identical. So people's perceived competence was essentially penalized for admitting to using AI to achieve the same outcome, right? That's bad.
But here's the part that really stopped me in my tracks. That penalty was not distributed equally. Male engineers who were perceived to have used AI saw a 6 percent drop in perceived competence. But female engineers in the same scenario experienced a 13 percent drop in their perceived competence. So that brings me right back to one thing that we talked about in the past couple of episodes on women and AI at work, which I'll drop links to in the show notes below if you've missed them. One of the big reasons why women say they don't use AI is because they perceive it as cheating. And our perception is grounded in an acute awareness of how other people perceive us.
Because in this experiment it became very clear that women using AI more negatively impacts our perceived competence than it does for our male counterparts who are using AI, or perceived to be using it. And that's fascinating because, to be clear, the Python code that was being reviewed was the same. It was all about perception. And this negative perception was particularly exacerbated amongst people who don't use AI, who haven't adopted AI. In this study it showed up in this way: the male engineers who did not use AI themselves penalized women who did use AI 26 percent more harshly than they penalized men who did the same thing.
So if you're working for someone who is old school about AI, who is anti-AI, and you're a woman working for them, you might have very good reason for keeping your use of AI to yourself, because it actually, genuinely is perceived as cheating. Not just in your own head, right? This isn't just an imposter syndrome conversation. This is grounded in our acute social awareness of perception management. I'll drop a link to that HBR study in the show notes. It's called The Hidden Penalty of Using AI at Work.
So here's where attribution bias meets what researchers call the competence penalty. Attribution bias means women get less credit for the same contributions, for having the same impact. And the competence penalty means women are actively perceived as less capable for using the same tools. So when you stack those kinds of unconscious bias on top of one another, you start to understand why women aren't adopting AI at the same rates that men are. It's not a skills gap, it's a penalty gap. Like, women's work has always been scrutinized more harshly than men's, and now that scrutiny is extending to how women are using or adopting or not adopting AI.
This has a chilling effect in more ways than just one. It's not just about our social perception. Now, this whole combination of attribution bias and the competence penalty, it doesn't just affect how we women are perceived, it affects how we behave. The researchers behind that study ran follow up surveys with over 900 engineers, and they found that many were actively avoiding AI to protect their reputations. The groups most vulnerable to the competence penalty, women and older workers, were the least likely to adopt AI despite potentially benefiting the most from the efficiency gains available there.
The researchers called it a paradox. But honestly, I think it's logical self-protection, right? When you know, consciously or unconsciously, that you're less likely to be praised and more likely to be judged for using a tool, of course you're less likely to use it. And the Lean In data backs this point up too. Women are 32 percent more likely than men to worry they'll be perceived as cheating when they use AI. That's not coming from nowhere, folks. All right? These are not baseless anxieties. They are accurate reads of an environment that treats women's AI use fundamentally differently than men's.
And here's the part I want to sit with for a moment, because I think it's a piece that often gets missed. This doesn't just all boil down to how other people are perceiving you. It impacts how we credit ourselves, how we attribute our own successes to internal versus external factors. Think about it. If you grew up being told, either directly or indirectly, that your value comes from how hard you work, that you have to be twice as good to get half as far, that you have to earn every single thing you get, that you have to suffer your way to success. If you have what I call in my book the martyrdom mindset, that it has to be painful, it has to be hard, it has to involve suffering for it to be valuable, then using a tool like AI that can make your job exponentially easier? Of course that feels like you're cheating. Not because anyone said that to you, but because somewhere deep down you've absorbed this idea that if you're not struggling, you're not deserving.
And for me, this really boils down to self-worth and our relationship to money and value creation. If you believe that you have to work hard to create a lot of value, then adopting AI is going to be really hard for you. And I can speak to that personally, as someone who grew up with, like, hard work as my central core value, a huge part of my identity. My parents have always worked hard, my in-laws have always worked really hard. A lot of people who I love and respect in my life have always worked hard and pride themselves on hard work. I myself feel better after a hard day's work than a day of relaxing. Like, I enjoy labor in some ways. And yet incorporating AI has really been a part of my overall mindset shift around money, and around labor, and around value creation.
In fact, I've been working on my own relationship to money for years now, around this belief that, like, actually it doesn't have to be hard. Like, what if I can create a lot of value for people, have a really positive impact, have some wonderful, productive, impactful outcomes without it feeling hard? With it feeling easy, while finding my flow, with money flowing my way, just, like, finding me. And that is a very big mindset shift away from: you put in a hard day's work, you collect your paycheck at the end of two weeks, and you do it all over again. Like, that is very disruptive to that entire way of being and of feeling valuable to people.
And so that inner voice that says to you, your wins only count if they cost you something, that's in some ways attribution bias turned towards you, like turned inward. And that can be just as powerful a barrier to AI adoption as any external penalty. Because you don't even need a manager to undervalue your AI assisted work, you're already doing that to yourself. So when we see that stat about women being 32 percent more likely to feel like using AI is cheating, I don't think it's just about fear of how others will perceive us. I think for a lot of us, it reflects a genuine, deeply held belief that making work easier somehow cheapens it. And that is a huge philosophical barrier to the revolution that I think is coming our way.
So this kind of puts us in a spot where we have this vicious cycle kind of operating on two levels, the external and the internal. Externally, women get less encouragement to use AI, less credit when they do, and more scrutiny and reputational risk when people do find out that you're using AI. And internally, women are less likely to credit themselves. When we're experiencing AI assisted wins at work, we're more likely to feel like the work doesn't count and more likely to opt out to preserve our sense of having, like, earned that success.
Both of these cycles, internal and external, they feed each other and they keep women from building the AI fluency that's rapidly becoming more essential for career advancement. Sheryl Sandberg herself put it well when she said that these small gaps can become really big chasms over time if we don't call attention to them early. And she's right. In a fast moving landscape like the economy and the economic evolution that we're living through right now, those early gaps can expand fast.
Now, if you've listened to my prior episodes on this, in which I'm very torn, right, philosophically, about women's adoption of AI, you might be thinking that, look, I already told you to start experimenting with AI at work. And yeah, I stand by that. But I want to be honest about the shortcomings of that advice. Telling women to just start using AI without addressing the external environmental impacts, like the environment, the workplace that they're using it in? That's just half the story. Because if the workplace you are in is going to punish you differently for using the same tools as your male colleagues, especially the same tools your male colleagues are being praised for using, then the issue isn't your adoption rate, it's the system around you. So we have to raise awareness among managers, among leaders, about the unconscious ways that bias intersects with our perception of who's using AI and who gets credit. Who gets praised for using AI versus who gets scrutinized for using AI.
There's a double piece of the education here. We have to convince women that AI adoption is worthwhile and can be helpful. But we also have to convince the people around women, who are judging women's performance, that women using AI is not cheating and that men using AI is not necessarily worthy of praise. Like, it is a tool. And so bringing a little more neutrality to that perception is the second half of the conversation that we have to have about this.
So what can you do with this information? I want to leave you with this, first and foremost, name the external bias. When you talk about attribution bias, you can actually remove its power a little bit. So once you understand what it is in your team and your own performance reviews and how your manager talks about who's innovative and who's cutting corners, it can become a lot harder for it to operate in the dark and operate unchallenged. So if you're a manager listening to this, I want you to ask yourself, honestly, am I praising men and women equally for experimenting with new tools? Am I encouraging my whole team to build their AI skills or just the people who already seem more tech-oriented?
And second, I want you to name that internal bias too. This one's a lot harder to do. But the next time that you use AI and catch yourself thinking, uh, am I just, like, taking a shortcut? Is the work product worse? Like, I want you to exercise discernment, right? The AI chatbot is not the final draft. Like, the AI produced work is not your final product. Like, discernment is good, but that doesn't mean self-criticism is warranted. That doesn't mean you have to beat yourself up emotionally for cutting corners when you're really just leveraging technology effectively.
One way that might be helpful is just to ask yourself, would I criticize a colleague or a male colleague in particular for using the same methods and tools that I just used for doing this work? And if the answer is no, then you gotta cut yourself slack and give yourself credit for the result, regardless of any AI assistance you may have received along the way.
And finally, I want to challenge you, even if it feels hard to do, to make your AI wins visible. This is one of the big things that the Lean In report recommends, and I think it's actually crucial. So if you saved hours on writing a report by using AI, or you caught an error by leveraging AI technology, or you used it to help you prep for a client meeting more efficiently, you should be sharing those wins with your leadership. You should be writing that into your self-evaluation. Because if attribution bias means your AI contributions are less likely to be noticed, you need to make them impossible to miss. And every time you share those little case studies of how you're leveraging AI and making it work for you, you're also inspiring those around you to do the same, right? That kind of courage is contagious and it's inspiring. And in doing so, you can help normalize the practice at your place of work.
Just be clear with your team, with your company, with your manager on what the parameters are, what the guidelines are, what the boundaries are when it comes to your workplace's use of AI and follow those boundaries. But for everything within those parameters, I would share early and often your wins, because they can help you get credit where credit is due and help you inspire others along the way.
Bottom line, attribution bias is not new. This is the same force that's at play behind things like the confidence gap or the leadership gap or the pay gap. It just, it has a new arena in the world of AI where it's showing up again. And the stakes in this arena are particularly high because AI fluency is quickly becoming a critical skill set for career advancement. So, yes, get in there, use the tools, but also keep your eyes open about the environment in which you are operating, about the story that you're telling yourself in your own head about your use of AI, and whether that is a good or a bad thing. Because it's the voice in your own head that tells you, you only are worthy of success if it was difficult to get there. And that is worth pushing back on.
Using every tool at your disposal to do your best work? That is not cutting corners, that is being the boss of your career. And I will confess: the episode you are listening to, and many of them recently, I created with the assistance of AI, right? I'm leveraging AI at Bossed Up in lots of innovative ways, not only because I find it to be really helpful in creating efficiencies in our systems here, but because I actually find it to be a very helpful thought partner when it comes to putting together episodes like this one for you. And so my hope is that I'm still creating value for you, that I'm still creating and hosting important conversations here on the podcast with you, but that it is taking me less time. And that is not a bad thing. And it's not something I should feel bad about admitting or acknowledging. Like, that is the world we live in. And I'm experimenting.
So I would love and welcome your feedback along the way. I want your feedback on all the recent episodes I've done around AI, women, and the double disadvantage. And if you want to start leveraging AI tools to help advance your career, I get into that in much more detail in my recently released LinkedIn Learning course, Get Unstuck: Make a Plan to Move Your Career Forward, which I'll link to in today's show notes. That course itself includes, incorporates, and embeds a lot of sophisticated large language models and AI-based chatbot interaction in the LinkedIn Learning platform itself. And I think they're doing some of the best work out there at the intersection of AI and career advancement. So I highly recommend you check it out.
In the meantime, as always, let's keep the conversation going after the episode in the Bossed Up Courage Community on Facebook and in the Bossed Up Group on LinkedIn, and I want to hear from you. So what do you make of attribution bias and the competence penalty? And how is that intersecting with your adoption of AI tools?
[CONFIDENT RHYTHMIC DRIVING THEME MUSIC WITH DRUMS STARTS]
As always, I want to hear from you. And my inbox is always open at Emilie@bossedup.org and until next time, let's keep bossin’ in pursuit of our purpose, and together let's lift as we climb.
[MUSIC FADES AND ENDS]