SHIFT's eLearning Blog

Our blog provides the best practices, tips, and inspiration for corporate training, instructional design, eLearning and mLearning.


    Why Instructional Designers Can’t Afford “Good Enough” AI Anymore

    Why the choice of AI tools your team uses, and where they live in your workflow, is becoming the new strategic question for L&D.

    The bar for what counts as "good" learning content has moved.

    Your learners spend their evenings watching cinema-grade visuals on their phones, listening to AI voices that are nearly indistinguishable from humans, and scrolling through infographics that explain complex topics in 30 seconds. Then they log into a corporate course and get clipart from 2014 with a robotic voiceover.

    The disconnect isn't subtle, and learners feel it. LinkedIn's Workplace Learning Reports consistently show that L&D teams are under pressure to connect learning to business outcomes, improve skills, and support retention — all of which depend on learning experiences people actually engage with.

    Here's the uncomfortable truth: AI didn't fix that problem. It just changed who has access to the solution.

    By 2026, many instructional designers can access tools capable of generating high-quality video, images, voiceover, and visual explainers — but output quality varies widely by tool, workflow, and governance. "Any tool" isn't the answer. The teams pulling ahead aren't the ones using more AI — they're the ones using the right AI, inside the workflow they already work in.

    This article is about why that matters, and what it really takes to create content learners actually want to engage with.

    The content quality bar has moved — permanently

    When generative AI broke into the mainstream in late 2022, the conversation was mostly, "Can it do this at all?" Three years later, the question is, "Which model is best for this specific job?" — and the answer is rarely the same for images, audio, video, and infographics.

    Images. The gap between a top-tier AI image and a generic one is the difference between a learner pausing to take it in and scrolling past it. Both technically deliver an image. Only one builds visual credibility for your training.

    Audio. Leading AI voice tools now produce highly lifelike, multilingual speech, making flat legacy text-to-speech feel increasingly dated. Learners hear these voices every day in the media they consume, and the bar for what sounds acceptable in eLearning has moved with them.

    Video. Today's AI video and avatar tools make it possible to produce scenario-based training in hours instead of months. Leading AI avatar tools can now support eye contact, gesture, and multilingual delivery. The result: video that scales with your audience, not against your budget.

    Infographics. Concepts that used to require a graphic designer and multiple rounds of revisions can, in many cases, now be drafted and iterated in hours rather than days — if the tool is good enough to produce something worth using.

    The common thread: each modality now has specialized tools that outperform generic AI for specific use cases. In a content landscape this competitive, the difference between the right specialized tool and a generic one shows up directly in whether learners engage or scroll past.

    For instructional designers, this raises a hard question: how do you pick the right tool for each job, keep up with a market that moves every month, and still actually build courses?

    Why instructional designers specifically need this

    The job description for an instructional designer has quietly expanded. Five years ago, an ID was a learning architect — write the script, build the storyboard, hand off to media production. Today, the same ID is expected to also be the writer, the designer, the voice director, the video producer, and increasingly, the data analyst.

    That's not sustainable without leverage.

    Timelines have collapsed. Course development cycles that used to run in months now run in weeks. Stakeholders see what marketing teams ship on a weekly cadence with AI tools and ask, fairly, "Why does our compliance refresh take six months?"

    Cost pressure is real. Recent L&D research, including LinkedIn's Workplace Learning Report, shows that learning teams are being asked to connect programs more directly to business outcomes. In that environment, external agencies, stock libraries, and one-off freelancers are increasingly hard to justify when AI can produce comparable assets in-house.

    Creative range is the new requirement. A modern course isn't a slide deck. It's a mix of micro-video, scenarios, interactive media, audio narration, branching simulations, and assessments. No single human is naturally great at producing all of that — but with the right AI tools, one ID can.

    Personalization is finally scalable. Different regions, languages, audience segments, and proficiency levels each want their own version of training. Without AI, that level of variation is hard to operationalize. With AI, it becomes much more realistic.

    The implication: AI for IDs is no longer a productivity hack. It's the only realistic path to keeping up with what the organization is asking for. The IDs and teams who lean into it aren't replacing themselves — they're upgrading what an ID is capable of producing.

    Why organizations need this (not just their IDs)

    Zoom out from the individual contributor, and the case gets stronger.

    Brand consistency. When every ID is using their own personal mix of AI tools, your training catalog ends up looking like it was made by twelve different companies. Centralized, sanctioned tools fix that.

    IP and compliance safety. Free-tier and consumer AI tools can differ materially in their terms for data retention, model training, output ownership, and commercial use. Enterprise-grade AI tooling — with clear commercial-use rights and data controls — is a category requirement, not a nice-to-have. Legal and IT teams already know this. L&D leaders need to catch up.

    Vendor and license sprawl. Okta's 2025 Businesses at Work report found the average company uses around 101 apps, while Productiv's SaaS-sprawl research has reported much larger portfolios in enterprise environments. Every new AI tool adopted ad hoc by L&D adds procurement overhead, security review, billing, and onboarding cost. Consolidation isn't bureaucratic preference — it's economic reality.

    Time-to-competency. Faster, better training translates directly into faster onboarding, faster product rollouts, faster compliance turnaround. The ROI argument for AI in L&D isn't mainly about saving on production — it's about closing the gap between "new policy" and "workforce performing on it."

    The takeaway: AI tooling in L&D is now a strategic decision, owned at the leadership level — not a tactical experiment owned by individual IDs.

    The hidden productivity killer nobody talks about

    Here's the part most AI conversations miss. Even when you have access to the best tools, the way you access them determines whether they actually save time.

    Most L&D teams today are working around a stack that looks something like this:

    • One subscription for image generation
    • A separate one for video avatars
    • A separate one for AI voices
    • A separate one for infographics
    • A separate one for the script and storyboard
    • And then the authoring tool, which has none of the above built in

    Microsoft's Work Trend Index has documented how fragmented digital work creates constant interruptions, with workers receiving a ping roughly every two minutes during core work hours. It's not just lost minutes — it's lost focus, lost context, and lost quality.

    For an ID building a course, the daily reality is:

    • Generate an image in tool A
    • Download it
    • Re-upload it into the authoring tool
    • Realize you need a different aspect ratio
    • Go back to tool A
    • Repeat for every asset
    • Version-control none of it
    • Pray your teammate can find the file you used

    Add collaboration to this picture and it gets worse. A reviewer in another tool can't see how an asset was generated. A teammate can't fork your work. A manager can't enforce brand standards across what is effectively a dozen separate workflows.

    The lesson: best-in-class AI tools, accessed through fragmented workflows, deliver a fraction of their possible value. The integration is the product.

    Enter SHIFT AI Studio

    This is the gap SHIFT AI Studio was built to close.

    SHIFT AI Studio is a single creative environment — available directly inside SHIFT Meteora and accessible from your preferred LMS — that brings together the best AI engines for the four content modalities IDs actually need:

    • Images — photorealistic, illustrative, and brand-aligned.
    • Audio — multilingual, human-grade narration and voice.
    • Video — AI avatars and scene generation for scenario-based learning.
    • Infographics — complex concepts visualized in minutes, not days.

    Three things make it different.

    It curates the best engines, so you don't have to.

    We've done the evaluation work — which model is best for which job, which is enterprise-safe, which scales. IDs don't need to be AI researchers to use the best AI available.

    It lives where your work lives.

    SHIFT AI Studio is native to SHIFT Meteora's authoring environment. Generate an asset, drop it into your course, iterate in place. And because it's also accessible from your preferred LMS, your team isn't forced to migrate platforms to benefit.

    It's built for teams, not just individuals.

    Shared assets. Shared brand controls. Shared workflows. Enterprise-grade security, commercial-use rights, and predictable cost structures — so IT and Legal have a clearer path to approval.

    The pitch is simple: stop choosing between great AI tools and a great workflow. Now you don't have to.

    The bottom line

    The instructional designers and L&D teams that will define the next era of corporate learning aren't the ones with the longest AI subscription list. They're the ones who can move from idea to finished, on-brand, beautifully produced course faster than anyone else — without compromising quality.

    That's the bet behind SHIFT AI Studio.

    See SHIFT AI Studio in action →

    Already a SHIFT customer? Ask your SHIFT representative about enabling AI Studio.

    Diana Cohen
    Education Writer | eLearning Expert | EdTech Blogger. Creative, passionate about my work, disruptive and dynamic in transforming the world of corporate training.
