The phrase "I want to, but the computer won't let me" captures a fundamental MIS truth: every information system encodes the mental models of the people who built it. Those builders had assumptions about what fields matter, what processes look like, what data should be collected, and what users would want to do. When a user's mental model of their own workflow doesn't match the developer's mental model baked into the software, the system literally cannot do what the user wants — not because of a technical limitation, but because of a human communication failure that happened months or years before the user ever touched the system.
This is why the five IS components — Hardware, Software, Data, Processes, and People — must be understood together. A technically flawless system built around incorrect assumptions about human processes and mental models will fail. The profession of MIS began with programming (hardware, software, data), but the deeper challenge has always been the last two components: processes and people. Programs encode business process logic. Business process logic comes from people. People are profoundly shaped by their mental models. Therefore information systems are, at their core, expressions of human mental models — for better or worse.
The Curse of Knowledge, introduced in Heath & Heath's "Made to Stick," is one of the most reliably observed cognitive biases in organizational behavior. Once you know something deeply, you literally cannot access the mental state of not knowing it. The classic experiment: tap a song's rhythm on a table and ask people to guess the song. Tappers predict a 50% success rate; actual listeners get it right less than 3% of the time. The tapper can hear the melody in their head — listeners hear random taps. The tapper is cursed.
In MIS, this plays out constantly. An IT professional who has been building databases for 15 years cannot fully imagine what it's like to encounter a relational database concept for the first time. They design admin interfaces for themselves, not for the accounting manager who will use them once a quarter. They write documentation in jargon that assumes the reader already understands 90% of the content. They build workflows that make logical sense given their mental model of the process, but feel alien to users who have never thought about data normalization.
The professor-vs.-TA example from the slides makes the point precisely: the professor who has used Excel for 25+ years has completely forgotten the experience of not knowing Excel. The TA who learned it six months ago still has access to that confusion — they can explain it from the inside. This is not about intelligence; it's about the irreversibility of expertise.
Gartner's Hype Cycle is not a prediction — it's a pattern-matching tool. Technologies don't follow it mechanically, but the pattern of hype → disillusionment → productivity is remarkably consistent across decades. The internet hit its Peak of Inflated Expectations in 1999–2000 (Pets.com, $50B valuations for companies with no revenue), crashed into the Trough of Disillusionment in 2001–2003, and reached the Plateau of Productivity with e-commerce, SaaS, and cloud services throughout the 2010s. 3D printing, blockchain, the metaverse, and autonomous vehicles have all run similar cycles.
Generative AI, as of 2025, shows strong signs of being at or near the Peak of Inflated Expectations. Indicators: massive, speculative corporate investment disconnected from demonstrated ROI; AI job listings and salaries surging to levels that imply universal AI transformation of all work; Apollo Global Management data showing large-firm AI adoption peaking at ~14% and beginning to decline even as infrastructure investment surges; and a flood of AI startups with business models that amount to "resell OpenAI's API with a prettier interface." The professor's prediction — expect a trough of disillusionment — is consistent with these signals.
One of the most important analytical distinctions in technology strategy is between investment, adoption, and diffusion — three measures that are often conflated but tell very different stories. A company can invest $50M in an AI platform (investment), deploy it to 5,000 employees who have accounts (adoption), and discover that only 200 people use it more than once a week (diffusion). The investment and adoption numbers look impressive in a press release; the diffusion number is what actually determines business impact.
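The gap between those three measures is easy to see as arithmetic. A minimal Python sketch using the hypothetical figures from the example above (the dollar amounts and headcounts are illustrative, not real company data):

```python
# Three different measures of the same AI rollout (hypothetical numbers)
investment_usd = 50_000_000   # investment: capital committed to the platform
seats_deployed = 5_000        # adoption: employees who have accounts
weekly_active = 200           # diffusion: employees using it more than once a week

# Diffusion rate: the share of nominal adopters who actually use the system
diffusion_rate = weekly_active / seats_deployed

# What the investment buys per person who genuinely uses the tool
cost_per_active_user = investment_usd / weekly_active

print(f"diffusion rate: {diffusion_rate:.1%}")                          # 4.0%
print(f"investment per weekly active user: ${cost_per_active_user:,.0f}")  # $250,000
```

The press release reports the $50M and the 5,000 seats; the 4% diffusion rate and the $250,000 cost per real user are the numbers that determine business impact.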
Apollo Global Management's 2025 analysis of U.S. Census Bureau Business Trends and Outlook Survey data found that AI adoption rates for large firms (250+ employees) peaked around 14% and began declining — even as AI investment by those same firms continued to surge. This is a classic anticipatory vs. actual demand gap: firms are investing based on what they believe the market will require (anticipatory demand) before actual usage by their own employees has demonstrated clear value (actual demand).
Jamie Dimon's perspective on AI (referenced in the slides) represents the measured view of a major institutional leader: AI is genuinely important and JPMorgan Chase is investing in it seriously, but the transformation will be gradual and sector-specific, not universal and immediate. This is the "slope of enlightenment" mental model — the technology is real, the value is real, but the timeline requires patience and specificity rather than general enthusiasm.
| Phase | Dominant Sentiment | What's Actually Happening | Gen AI Example |
|---|---|---|---|
| 1. Technology Trigger | "This will change everything!" | Breakthrough generates early media coverage; commercial viability unproven; products often don't exist yet | ChatGPT public release, Nov 2022; demos of image generation, code writing, essay drafting |
| 2. Peak of Inflated Expectations | "Get in now or be left behind" (investors) / "This will replace all jobs" (media) | Frenzy of publicity; overenthusiasm; speculative investment surges; early success stories amplified | $13B Microsoft/OpenAI; $5B Amazon/Anthropic; Nvidia to $3T market cap; AI job salaries surging to $1M+ |
| 3. Trough of Disillusionment | "Cut investment. It's garbage!" | Experiments fail; media turns negative; interest wanes; producers fail or exit; survivors improve product | AI adoption flattening at ~14% for large firms (Apollo, 2025); many AI startups failing; ROI elusive at scale |
| 4. Slope of Enlightenment | "Hmm, this is actually pretty good. I'm seeing benefits." | Practical applications emerge; best practices develop; second-gen products appear; cautious enterprise pilots | AI for contract review, customer service automation, code generation — specific, measurable, proven use cases |
| 5. Plateau of Productivity | "Yeah, pretty much everyone that needs it uses it. Solid stuff." | Mainstream adoption; market applicability clearly understood; technology becomes infrastructure | Not yet reached — likely 5–10 years away for GenAI; cloud computing is a current example of this phase |
Source: Gartner, Inc. — gartner.com/en/research/methodologies/gartner-hype-cycle; applied examples from Gallaugher Ch. 1 slides and Apollo Global Management (2025).
| Firm | AI Strategy | Key Investment / Signal | Hype Cycle Position |
|---|---|---|---|
| Microsoft | Embed AI into every product (Copilot in Office, Teams, Azure); own the enterprise AI stack | $13B for 49% stake in OpenAI; Copilot integrated across Microsoft 365 | Riding the peak; early enterprise Copilot adoption data mixed on ROI |
| Amazon / AWS | Provide AI infrastructure; build proprietary models (Titan); invest in frontier labs | $5B pledge to Anthropic; AWS Bedrock AI platform; Trainium / Inferentia AI chips | Infrastructure bet — positioned to benefit at any cycle stage if AI grows |
| Nvidia | Dominant AI chip supplier (H100, Blackwell GPUs); 80–95% high-end AI chip market share | Market cap: ~$300B (2020) → $3T+ (2024); entire revenue tied to AI infrastructure demand | Most exposed to hype cycle — if enterprise AI adoption stalls, GPU demand craters |
| Alphabet (Google) | Defend search dominance; build Gemini; operate TPU AI chips; DeepMind research | Gemini integrated into Search, Workspace; TPU custom AI ASICs; Waymo autonomous vehicles | Defensive investment — protecting core search revenue while building new AI products |
| Meta | Open-source AI (Llama models); AI for ad targeting; AI-generated content on platforms | Paying "ludicrously high" salaries for AI talent; Llama models released free to developers | Unique bet: open-sourcing creates an ecosystem Meta can monetize through its platforms |
| Apple | On-device AI ("Apple Intelligence"); privacy-first; Neural Engine hardware since 2017 | Paying extraordinary salaries for AI talent; Apple Intelligence in iOS 18; partnership with OpenAI | Most conservative — betting on local AI that doesn't require cloud, protecting privacy brand |
| Component | What It Is | Mental Model Connection | Example Failure |
|---|---|---|---|
| Hardware | Physical devices: computers, servers, phones, sensors, network equipment | Hardware choices encode assumptions about who uses systems and how — desktop-first design excludes mobile workers | Building a system on desktop hardware when all users are in the field with phones |
| Software | Programs and applications: OS, databases, enterprise apps, custom code | Every line of code is a decision made by someone with a particular mental model of the business process | Excel treating 0 as FALSE and 1 as TRUE — a mental model from 20th-century programmers baked into every spreadsheet |
| Data | Raw facts, figures, records: what gets stored, in what format, with what labels | What data gets collected reflects whose information needs were considered — often not the end user's | A CRM that captures "first name / last name" fails for cultures with different name structures |
| Processes | The workflows, procedures, and business rules the system automates or supports | Processes come from people — they reflect how someone thinks work should happen, often not how it actually happens | "I want to, but the computer won't let me" — system enforces the designer's process, not the user's reality |
| People | Users, administrators, designers, managers — everyone who interacts with the IS | People are where mental models live; the Curse of Knowledge means experts build for experts, not for users | Professor (25 yrs Excel) worse at explaining Excel than TA (6 months Excel) — curse of knowledge in action |
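The Software row's point — that a designer's truthiness convention (Excel treating 0 as FALSE) gets baked into every downstream use — can be illustrated with a minimal Python sketch. The `order` dictionaries and the `discount` field are hypothetical, invented purely to show the pattern; Python's own truthiness rules (0, "", and [] are all "falsy") play the role of Excel's 0/FALSE convention here:

```python
# A designer's mental model baked into code: this check silently
# conflates "no discount was set" with "a 0% discount was set",
# because Python treats 0 as falsy.
def has_discount(order):
    return bool(order.get("discount"))

order_a = {"discount": 0}   # customer explicitly assigned a 0% discount
order_b = {}                # no discount field at all

print(has_discount(order_a))  # False -- the legitimate 0 looks like "missing"
print(has_discount(order_b))  # False

# Making the mental model explicit: test for presence, not truthiness.
def has_discount_fixed(order):
    return "discount" in order

print(has_discount_fixed(order_a))  # True
print(has_discount_fixed(order_b))  # False
```

The user who types a 0% discount and watches it vanish is living the "I want to, but the computer won't let me" experience: nothing is technically broken, but the designer's assumption does not match the user's.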