Finding Balance in a World of Technology

Don Sturm • November 5, 2025

Lately, I have been thinking about a conversation that my co-host and I had last school year with a college professor we interviewed for our podcast, CareU. She was engaging, thoughtful, and genuinely likable, the kind of guest who makes you think long after the conversation is over. At the end of the interview, I mentioned to my co-host that I would have hated being in her class. Not because the professor didn't care about learning; she absolutely did. It was because she banned all devices.


If I were a student in that room, I would have been lost. I've built a system that works for me. All my notes, highlights, and digital bookmarks are stored on a device, making it easy to make sense of my reading. Being told I could not use it would have felt like taking a test without my glasses. It's not that I need technology to think. It's about having developed a way of working that helps me process and remember.


That conversation made me realize something about how we handle technology as a society. We tend to react instead of reflect. We have a habit of waiting until technology creates problems before we decide what to do about it. That is how we handled cell phones in classrooms. We are doing the same thing again, only this time with AI. New technology often finds its way into classrooms before we have time to think deeply about how it should be used. Only after the problems appear do we start trying to fix them.


I enjoy using AI daily; it aids my thinking, creativity, and planning. However, I worry that tech companies are the ones setting the rules. They are building the tools, shaping the norms, and defining what responsible use looks like while the rest of us try to keep up. It feels backward. We should be deciding, as educators and as a society, how we want these tools to serve us. 


We have been here before. When devices first entered classrooms, we didn't spend much time reflecting on their purpose. We plugged them in and hoped for the best. When distraction and disconnection followed, we reacted again. We banned phones. We limited devices. We focused on control. But banning is not the same as teaching. It skips the hard work of helping students learn how to make wise choices.


That brings me back to the college professor. I understand what she is trying to do. There is value in starting analog. There is something meaningful about slowing down, using paper, and finding a sense of focus away from screens. But when an entire course completely shuts out technology, we miss an opportunity. That is not the world students live in, nor is it the one they will enter when they leave school.


This raises a question I often wrestle with. Should school mirror the "real" world, or should it act as a frontline defense against the parts of that world that we feel harm students?


In my district, we have blocked social media on student devices. I see that as a good decision. We are not pretending social media doesn't exist. We acknowledge that while it is part of the world, it is not a part that belongs in the classroom. We know the toll that endless scrolling and comparison can take, so we are choosing to protect students from that during the school day. It was an intentional choice rooted in care rather than fear.


The goal should not be to eliminate technology. The goal should be to use it with purpose.


Students need help learning to decide what works for them. Some prefer a notebook. Others highlight and annotate on a tablet. Both approaches can be valid. What matters is giving students space to reflect on their choices. When we remove that opportunity, we also remove the chance to practice responsible use.


It is about being deliberate. We do not need to chase every new tool, nor do we need to shut technology out completely. Our work is to help students learn to live and work with technology wisely, creatively, and reflectively. That is how we prepare them for a world that will not slow down, a world where technology and AI are part of daily life. The question is whether we will keep reacting to change or start shaping it with purpose.

Don Sturm is in his 35th year in education. He was a high school social studies teacher for 23 years before becoming the technology integration specialist for Morton CUSD #709. While he is passionate about all things education, his main interest is helping educators reflect on their classroom practices. He is the co-host of three podcasts: Through the Educational Looking Glass, which focuses on teaching and learning; CareU, which focuses on helping those raising kids be more intentional in their practices; and That's the IDEA!, which dives into fresh perspectives and innovative ideas in education.


Don has been married for 37 years and has two grown children. He loves to travel and read.
