Navigating Complexity: Why Data Must Embrace Geographic Rural Realities

Jeremy Rinkel • December 6, 2023

I recently attended a meeting discussing the state of Computer Science and the opportunities for students in Illinois. Despite some missing data from smaller rural districts, the research was profound, revealing a disparity in the opportunities and offerings of CS courses throughout the state. The disparity lies not only in opportunities but also in the data itself, which leads to an unclear picture of the state of CS in Illinois as well as the rest of the country.


Using data to make decisions in education isn't a groundbreaking idea. Think of data as a storyteller – it can be tricky, sometimes showing a biased perspective based on where and how the data is collected and analyzed. Some of this data can be twisted intentionally as well as accidentally, adding to the unclear picture of the current state of the concept being studied.


Let’s explore three challenges to data collection and data-driven decision-making, with CS in Illinois as the focus.


Geographic Diversity

There is no one-size-fits-all approach when it comes to CS opportunities and offerings in Illinois. Ignoring geographical and cultural distinctions can compound challenges and deepen inequalities. A lack of resources, including internet connectivity, can put rural communities at a disadvantage in relation to CS opportunities. What works for districts with an abundance of resources may not work for rural districts.


Limited Resources

Rural areas have historically been left behind when it comes to educational funding. School districts face constraints not only in funding but also in infrastructure. In some parts of our state, affordable and reliable internet is still an issue. This disparity can lead to a lack of the essential data needed to inform our decisions about CS opportunities.


Lack of Shared and Timely Data

With extensive data sets comes the issue of a lack of timely and actionable information for decision-making. “The amount of data does not determine the potency of the insight you can gain from it. People often think that data alone has value, and that is an expensive misconception” (Thompson, 2023). Insufficient data sharing among the numerous sources and surveys conducted by diverse agencies and organizations results in disparate narratives conveyed by the data.



Final Thoughts

Data, while a powerful storyteller, can also be misleading, often influenced by the context of its collection and analysis. Examining the challenges to data collection in the context of Computer Science education highlights the crucial need to consider geographic diversity. Neglecting cultural and regional distinctions can perpetuate inequalities, especially concerning limited resources in rural areas, where educational funding and infrastructure may be lacking. Furthermore, the issue of timely and actionable information emphasizes the importance of shared and comprehensive data to make informed decisions in shaping Computer Science opportunities for students. Addressing these challenges is paramount to fostering an equitable and effective educational landscape for the future.

Sources:



Notes: 

  • Images were created utilizing the website Ideogram.ai, a text-to-image AI tool


  • ChatGPT 3.5 was used to organize and improve the wording of my original thoughts


  • ChatGPT 3.5 created the title of this post

Jeremy is a TEDx Speaker and a Jr. High Computers & STEM Teacher in Effingham, IL. He has earned a Masters in Educational Policy from the University of Illinois and a Masters in Teaching from Greenville University. His goal is to inspire students, teachers, and anyone he comes into contact with to be a lifelong learner. Jeremy believes education is the key to solving our world’s problems. In his free time, Jeremy enjoys traveling, writing, spending time in coffee shops, and watching old TV shows on Netflix with his family.
