Helping Students Build Their Three-Dimensional Selves in the Age of GenAI

In Alan Moore’s BBC Maestro course on writing, he talks about building authentic characters. One-dimensional characters, he explains, are either “goodies” or “baddies”; everything about the character can be attributed to that label. Two-dimensional characters are one-dimensional characters with some standout physical or emotional trait, a limp or a problem talking to women. Three-dimensional characters, the gold standard, require going much further: the writer needs to know the character inside and out, so that the character possesses his or her own agenda and the agency to work toward it. As I listened, I found that much of what he had to say about creating characters applied to building curricula. Early in my career, I worked with curricula that sorted the “goodies” from the “baddies.” I’ve also worked with curricula that assigned an additional trait: a comment from a comment bank, or a letter grade for effort and participation, which often boiled down to whether a student was likable or not. It wasn’t until I started with a personalized, competency-based approach that I really got to work with that gold standard, where I learned so much more about my students, where they learned so much more about themselves, and where I learned how to better help them along their way.

Helping Students Find Their Way

We spend a lot of time talking about purpose in class. It’s also the competency strand that I measure most loosely. I simply ask students to construct a statement that defines a long-term goal, connected to personal interests, that positively impacts others. Most students do not have a clue about this, so the intermediate stage is a more modest commitment to an interest, some goals they can connect to it, and some exploration of how that work might contribute positively to others.
Despite the loose evidence, it’s the outcome that usually leaves the biggest impact, even if it takes years to come to fruition. I like to introduce it with a riptide analogy; if nothing else, students get the benefit of a free public service announcement. In a riptide, I say, the important thing to remember is that your purpose is to get back to shore, not to beat the riptide. My shorthand is to repeatedly ask, “What’s your shore?” The purpose of the metaphor is to get students to accept momentary limitations and to move perpendicular to where they think they should be able to go, so that they can reposition themselves for where they truly mean to go. It’s a hard lesson to learn, and hundreds die in riptides each year. As for purpose, I have no idea how many people succumb to the consequences of forgetting theirs or never having one in the first place, but in the US alone, the rampant rates of depression, addiction, and suicide, particularly among teenagers, might give us an inkling. In this work, a number of students come to mind. I received an email recently from a former ninth grader. She wrote to reflect on how the competency-based approach had impacted her: she eventually grew to like arguing for her grade, but she had felt that the purpose competency strand was useless. Now considering her options for university, she wanted to tell me that she felt differently, that even though nothing measurable had come from reflecting on purpose during our work together in ninth grade, the focus had planted a seed, and maybe that work had been the most important thing we did all year. Another particularly challenging student fought me tooth and nail on purpose. He told me that he had no idea what this had to do with an English class and that there was no value in it. He was good with computers, he’d get a computer science degree, and he’d make bank.
We compromised on him sharing an interest, maybe computers, maybe something else. He said he liked music and that he’d like to learn piano. As I allow students to bring in evidence from outside class for their competency interviews, I thought this was a great opportunity. We had a program at school called 88 Keys, an opportunity for students with no piano experience at all to pair with a proficient player and commit to learning and playing a piece for a concert. This student immersed himself in the piano, so much so that he got busted for being out after curfew because he was listening to Rachmaninov in the dorm’s courtyard. When the discipline committee got the details, they let him off with a light cleanup duty. For the concert, he decided to play Beethoven’s Moonlight Sonata, and there was a professional pianist in the audience. The professional told him he had a lot of mechanical things to improve upon, but that he was genuinely impressed with the skill level he’d reached in such a limited amount of time. When he told me this, I couldn’t help myself; I asked him if he’d found his shore. He promptly told me what I could do with my shore. When we met in the fall of the following year, he sought me out. He was still going to apply for computer science, but he was adding a music minor. He made it clear that this was not a shore, but he felt he needed to say thanks. These stories could easily not have been told. There’s so much that works against this kind of thing, so much that wants to flatten objectives into singular, linear paths. In a private school context, parents pay a lot of money for this education, and they need the expense to be justified with a career path, as much as that is still possible today. The university programs that students want are competitive. The demands of high school are a lot to handle. Their peers are always eyeing their grades. Teachers all feel that their class is the most important and load on the homework accordingly.
But amidst all this, each student’s purpose is different from that of every other student in the class, and the skills, knowledge, and attitudes they need to work toward that purpose are also different. As such, does this linear path toward “success” make sense?

Digging Deeper

Our “growth orientation” and “citizenship” competencies took a while to get right. To be fair, it takes the better part of a career to distill experiences with students and align them with course objectives, department objectives, and a school’s vision and portrait of a graduate. It’s the kind of thing we’ll probably always be tinkering with. Much of what is in these strands is what are often called soft skills. I remember being CAS Coordinator (the IB’s co-curricular portfolio: creativity, activity, service) at a previous school and giving a presentation to parents about requirements and expectations. At the time we were using the IB’s Learner Profile in the way that I now use competencies, and I referred to many of them as soft skills. A parent asked me bluntly, “Do you teach these things?” I had to be honest: “No, we don’t.” To be fair, they are not as easy to teach as content. Give some facts, teach a skill, ask a student to replicate that in a novel context. Many of my colleagues would argue that soft skills are implied, that if a student performs well on a given task, then they have learned these as well. That never seemed right to me. One of my DP Language and Literature students kicked the doors open on that notion. He was a high-performing student, bright, a skillful writer, but his work was always late. We had a policy, which I agreed with, stating that lateness should not impact the score given on an assignment. But it really felt like this student was abusing the policy. Both his peers and I expressed repeated frustration and disappointment, so he began ducking us and missing classes.
Ignorant of the circumstances, the narrative that we’d all built was that this student was lazy and abusing a forgiving system. When he and I finally sat down to talk about it, I learned a very different story. He’d done all of the work, multiple drafts in fact, but none of them measured up to his expectations of himself, or those of his parents, so he kept starting over. The kid wasn’t lazy, which had never felt accurate; he was a perfectionist. Since we didn’t have any course outcomes pertaining to this, he didn’t know that about himself, I didn’t know that about him, and I was not in a place to help him. He was just someone who complicated the “goodie/baddie” formula. That learning experience led to our current risk-taking competency strand. While I think I’ll forever be tinkering with these competency strands and their descriptors, I do feel that we are now at a place where we can assess a three-dimensional character. To get a sense of how these interview assessments work, I left in the comments an excerpt of a recording from a former ninth grade student who has since moved on to university. She was a high-performing student who, through her competency-based work, became much more aware of how her strengths masked weaknesses, and of the further work needed to build the skills, knowledge, and attitudes to pursue a purpose she hadn’t yet committed to.
How GenAI Factors Into This

While I feel that I am close to realizing my sci-fi dreams of having an AI assistant, my students are less enthused. There may be a few reasons for this. For one, as good as ChatGPT’s voice mode is, students might be coming up against some form of uncanny paradox. They are different from my generation: they don’t do phone conversations, and so much of their lives is already in front of screens that they might be less than motivated to add more. Also, GenAI doesn’t lend itself to passive, algorithmic consumption the way social media does. Returns depend on what you input, and my students don’t quite know what to look for; if they do, they don’t yet have the drive or the patience to keep iterating in order to find it. “Digital Natives” though they may be, it might take some time and patience for them to warm to this new technology, and it might be those who remember a time without these tools who guide the way. There are certainly concerns about how this technology will be misused for immediate and undeserved gains, but there is also incredible potential for individualized and personalized growth. This is going to take thoughtful policies on the part of administrators, and it is going to take constant monitoring and reflection by teachers to approach a comfortable balance. If nothing else, perhaps this disruptive technology has become a watershed moment, a signal that education needs to change. A personalized, competency-based approach built on universal design for learning offers us a solid way forward.

Time to Pause and Reflect

This has been an eventful year, and like many, I need a little downtime. I’m sure there will be new announcements this summer that might further disrupt how I am thinking about things. Foundationally, though, I feel that we are in a good place. We have a competency-based approach and appropriate student outcomes to work towards in order to support student development.
After taking a break, I plan to “attack my assessments” with GenAI. Some will make the cut and some will get binned; some things will be done in class, some at home; some will be done without GenAI, and some will require students to use it. That means I’m going to need to play with these tools some more and teach my students how to work with them, and on the teams that I am a part of, I’m going to need to help my colleagues learn how to use them as well. We’re all going to need to understand these tools so that we can have meaningful conversations that allow us to find a balance between trust and accountability in student usage. That work will be grounded in UDL principles (rubric attached in comments) so that we can redesign courses where…
With both summer reading and the Lego piece I started my day by stepping on in mind, I’ll end on an instructive excerpt about Lego from one of my favorite books, Sophie’s World by Jostein Gaarder:

“Why is Lego the most ingenious toy in the world? For a start, Sophie was not at all sure she agreed that it was. It was years since she had played with the little plastic blocks. Moreover she could not for the life of her see what Lego could possibly have to do with philosophy. But she was a dutiful student. Rummaging on the top shelf of her closet, she found a bag full of Lego blocks of all shapes and sizes. For the first time in ages she began to build with them. As she worked, some ideas began to occur to her about the blocks. They are easy to assemble, she thought. Even though they are all different, they all fit together. They are also unbreakable. She couldn’t ever remember having seen a broken Lego block. All her blocks looked as bright and new as the day they were bought, many years ago. The best thing about them was that with Lego she could construct any kind of object. And then she could separate the blocks and construct something new. What more could one ask of a toy? Sophie decided that Lego really could be called the most ingenious toy in the world. But what it had to do with philosophy was beyond her. She had nearly finished constructing a big doll’s house. Much as she hated to admit it, she hadn’t had as much fun in ages. Why did people quit playing when they grew up?”

Perhaps GenAI is a box of Lego pieces. Some of us will dive right in and build. Others will need some directions to start with. There’s satisfaction in building both ways, and many build very successful lives and careers by doing little more than the latter. But imagine a world in which that was all there was: assembling pieces to someone else’s design. If we want more of the former, then we need to trust our students to take greater ownership of their learning.
We need to observe them, learn from them, and support them along the way so that they can succeed in their work of building their own three-dimensional selves. I wish everyone a restful summer holiday full of joy and play and wonder. And I welcome continued collaboration as we enter what I’m sure will be another eventful year. For now, it’s time for me to get back to work on my own three-dimensional self.
"Contra" vs. "Rogue" and How an Old Gaming Beef Might Shed Light on How Educators Hold EdTech to Task

There has been some buzz surrounding the potential for EdTech to transform learning through gamification. It makes sense, and I greatly value games in learning. Nearly all of my students enjoy video games, and if more traditional schooling can’t compete with gaming, perhaps schools should adopt gamification to improve engagement. But if we are going to do this, it might be worth asking ourselves what further gamification of learning might look like with advances in GenAI. One of my favorite video game experiences as a kid was beating the Nintendo game "Contra." Up, up, down, down, left, right, left, right, B, A, select, start. Code entered, I was now in side-scrolling, level-clearing, alien-killing bliss. A friend and I would do what we could never dream of doing with our measly three little lives… we would beat the game. Our engagement was held by the novelty of experiencing each new level and its respective boss. That took us all the way to the end, where we got to experience the euphoria of taking down the big boss, which would cue a pixelated explosion and a triumphant chiptune melody. Once we’d beaten the game, though, it never really occurred to us to try to do it with only three lives. That was well outside the zone of proximal development for me and my friends; cue the Luke Skywalker “That’s impossible” meme. And besides, we’d already seen what happened at the end, so there was nothing really left to engage with unless you were into looking for Easter eggs, which we weren’t. For us, the objective was to keep scrolling to the right until there was no more scrolling to do. As there was nothing more novel to experience, our engagement was sapped, we got bored, and we moved on. Now compare that experience to the PC game "Rogue," a dungeon crawler that randomizes resources and challenges.
When you die, you go back to the beginning, armed with experience that you must apply to novel circumstances. The game has a cult appeal that spawned various other games in which the player’s avatar is re-skilled and retooled for the adventures they failed, returning as a stronger version of themselves. The objective is still to complete the game, but there is replayability because the player’s evolving skills are applied and adapted to changing circumstances. As novelty is retained and there are multiple pathways to achieving the goal, engagement remains. You eventually get bored of the parameters the game allows for, but the experience is arguably more robust. I mention both of these games because in the GenAI age, EdTech is repeatedly offering promises of gamification that would “personalize” learning. I sat in a presentation the other day where the speaker repeatedly quoted John Dewey’s constructivist, Rogue-like model of education while pointing to a linear, side-scrolling articulation of Bloom’s Taxonomy. In my early explorations of personalized learning, there was a clear distinction between personalized learning and individualized learning. Individualization meant the student went at their own pace through a fixed path towards a fixed end. Personalization was something different. It could encompass an individualized approach at times, but the purpose was to have students construct their own paths and meaning towards a combination of external and personal objectives. Though not a perfect analogy, individualization is to "Contra" what personalization is to "Rogue." But in the branding and marketing of EdTech services, these terms are repeatedly conflated. Rogue and what was Rogue-like went through similar conflations, and the disagreements came to a head at a Berlin conference in 2008, where participants hammered out 15 aspects consistent with Rogue-like games.
While that might seem like a trite point of comparison and a needless pursuit of purism, what you had was a group of gamers who had experienced something different, and they wanted to push others to build in that vein. When games compromised those aspects but still called themselves Rogue-like, they corrupted that pursuit, thereby corrupting the sensibilities of the users. For such sins committed after that conference, falling even one aspect short of the requisite core led a game to be labeled Rogue-lite. With this in mind, which of the two game models above would best serve the needs of our students in today’s world, and which one are EdTech companies currently offering us? What they offer is primarily individualized, which is fine, but they are marketing it as personalized, which is not. Perhaps education is in need of its own Berlin conference to clearly define what is meant when the word “personalized” gets brandished. If that happened, I wonder what kinds of products and services we might have at our disposal, particularly if they were seeking to avoid the ignominious labels of “Personalized-lite” or “Individualized.” Imagine the EdTech version of "Rogue," where students mapped out their own goals, leveled up on formalized side quests, collaborated with others, failed, regrouped, retooled, re-skilled, tried again... perhaps choosing to alter the path and/or objective they’d originally set out upon. Now imagine the EdTech version of "Contra," side-scrolling to a fixed end but getting the help you need along the way. Both have their place, but clearly they are not the same. Regardless of what EdTech offers us, it will be up to teachers and administrators to build systems that responsibly incorporate what is currently available and what will become increasingly available soon. We need to explore and better understand the capabilities of this technology, and we need to think about what it is that we want for our students.
Individualization has its place, but personalization is the gold standard, and we need to know and recognize that distinction.

Next Time: In next week’s piece, the final one before the summer break, I will show some of the benefits that personalization has had on my students and how thoughtful incorporation of GenAI into a personalized learning structure can potentially take them even further.

Assessing and Grading in Competency-Based Learning and Leveraging GenAI for Support

My favorite part of our ninth grade student interviews was their discussion of “Growth Orientation.” This was part of the grading process my team and I designed. We broke this particular competency down into seven parts: purpose, resourcefulness, planning, time management, motivation, risk-taking, and reflection. What students had to say during this part was almost always insightful and substantive. We started with purpose, which we defined as a “long-term goal, connected to personal interest, that positively impacts others.” Most of our ninth grade students fell into the “progressing” category for this one. They would say that they were actively exploring interests, considering long-term goals related to those interests, and thinking about how those goals might positively impact others. All the while, they would be tasked with providing evidence of the steps they were taking to find that elusive purpose. We’d then move to resourcefulness. Did they know how to find answers to the questions they had, or were they reliant on teachers to provide the answers for them? Did they generate their own questions? My repeated line with my students was that if they still needed me at the end of the year, then we had both failed. The middle chunk was especially enlightening, particularly for students who struggled to get work done on time.
Some students would say that this was a planning problem; others would say that they made plans but couldn’t stick to them; still others would say they had plans and could stick to them for a time, but then they would lose motivation midway through a semester. Others would tell me that they had done the work but weren’t happy with it, so rather than take a risk at getting negative feedback, they just wouldn’t turn it in. These are all very different issues that require different strategies to address, and yet how often are students who don’t turn in work reduced to being called “lazy”? To argue for their level of mastery in each of these strands, students would often use evidence from Student Directed Learning Time, a block of time each week where students, under our supervision, took ownership of their learning. As for reflection, by the end of their talk on the first six strands, teachers and students could clearly see how things were progressing. This type of experience, and the demands leading up to it, gives students a better understanding of themselves. It also allows teachers a broader range of opportunities to better understand their students and to offer them feedback to assist their growth.

Assessment of, as, and for Learning

Traditional alphanumeric grading systems, which assess only course content and narrow skill sets, are painfully reductive in comparison. In a competency-based system, even if students perform well on a given assessment, demonstrating mastery in several of the Communication strands, they might have struggled with strands of Growth Orientation. Perhaps they didn’t take sufficient risk. Or in their Citizenship... perhaps they were not particularly gracious giving or receiving feedback during writing workshop. Focusing on competencies, and on student-led interviews in particular, enhanced my effectiveness at personalized learning.
I believe GenAI has the potential to further improve upon this practice, and I will highlight those areas below. But before that, I’d like to elaborate on how this grading process worked in the school we created it for, and how I adapted it for a more rigid system.

School A

I started working at this school immediately after my experience with the IB. I had enjoyed working with the IB, and I felt its best-fit grading process was a game changer for keeping students motivated until the end and getting them to take necessary risks throughout the year. The problem I find with the IB, though, is that there is too much language for it to be meaningful to students: four criteria in each of six subject groups, ten Learner Profile characteristics, and five Approaches to Learning categories, each with a page or more of descriptors. That can lead to a lot of dusty language hanging on the walls. Most of our 9th grade students in that school came from one of two environments: a gradeless, competency-based middle school or a high-stakes testing school. Our team’s job was to build a bridge into the high school that accounted for these differences while preparing students for the rigor of content-heavy AP exams in the later grades. And we needed to make this work with a traditional percentage grading system. We addressed this challenge by building competencies based on the school’s ‘Portrait of the Graduate.’ We also borrowed from the IB Learner Profile and Approaches to Learning. We broke that language down as best we could into the following competencies: Communication, Growth Orientation, Citizenship, and Thinking. We further divided each competency into five to seven strands and created 'I can' descriptors for each, assessing them on a mastery scale: Mastery, Almost There, Progressing, and Not There Yet. Each strand had a point total corresponding to its level of mastery, and the sum across strands was the grade out of 100.
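To make the arithmetic concrete, here is a minimal sketch of how a scheme like this can translate mastery levels into a grade out of 100. The strand names, point values, and level-to-fraction mapping below are hypothetical, invented for illustration; they are not our actual rubric.

```python
# Hypothetical mapping from mastery level to the fraction of a strand's points earned.
MASTERY = {"Mastery": 1.0, "Almost There": 0.85, "Progressing": 0.7, "Not There Yet": 0.5}

# Hypothetical strands with maximum point values; the maxima sum to 100.
STRAND_POINTS = {
    "purpose": 15, "resourcefulness": 15, "planning": 14,
    "time management": 14, "motivation": 14, "risk taking": 14, "reflection": 14,
}

def grade(levels: dict[str, str]) -> float:
    """Sum each strand's points, scaled by the mastery level the student argued for."""
    return round(sum(STRAND_POINTS[s] * MASTERY[levels[s]] for s in STRAND_POINTS), 1)

# A student at "Mastery" on every strand earns the full 100 points.
print(grade({s: "Mastery" for s in STRAND_POINTS}))  # 100.0
```

Because every level maps to a nonzero fraction, even a “Not There Yet” strand contributes points, which keeps the conversation on growth rather than on zeros.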
It was clear, but complicated enough that we stopped talking about grades and started talking much more about each of the competency strands. We determined grades entirely through four competency interviews, and it was the responsibility of each student to collect portfolio evidence to argue for their level of mastery in each strand. It was a reasonable compromise for all parties. If students asked what their grade was, we kept telling them that their score was the total of their best argument, with evidence, for each competency strand. With Canvas as our LMS, its mastery gradebook allowed us to keep track of the individual assessments, giving us data on when to push or intervene. Students and parents also had access to this data.

School B

Transitioning to School B required some compromise. It is a larger, national school that offers the IB’s DP program for students in grades 11 and 12. As this is a national system where students are ranked according to their grades, the grading process can be quite rigid, and students and parents tend to be hyper-focused on their scores. At the same time, the school is committed to a competency-based focus. The challenge, then, is how to integrate these competencies into a system with heavy grade pressure. In grading environments such as these, tension tends to run high, and anything new can feel like it will crush everything beneath it. Unlike the previous school, where we made the competencies the course’s criteria, grades here are attached to existing criteria. There are also strict percentages that determine average grades and heavy moderation of grades between classes. The only flexibility was the 'Effort and Participation' score. It was only 10 percent, but in a grade-heavy environment, that was still enough to get my students’ attention. So I took the competency-based structure I’d previously created and adapted it to the school’s competencies.
In this case, I removed the numbers from each strand and created a rubric emphasizing risk-taking and honest reflection, characteristics that are rare in high-stakes assessment environments. I then made these competencies the core of the course, highlighting weekly strands and adapting the curriculum to focus on them. The effect? It worked. Students’ reflections were excellent and had a strong impact on their performance in subsequent classes. Administrators and colleagues took notice, and there are discussions about expanding this into other areas next year. Most importantly, we found meaningful ways of getting language off the walls and into the minds and experiences of our students. The easiest thing to have done this year would have been to drop this approach, but that’s really not a possibility for me anymore now that I’ve seen the impact of such work. In this case, we reached a reasonable compromise, and I’m excited to build on it next year.

Leveraging GenAI to Assist with Assessment

I’ve had some early successes integrating GenAI into this process, and I’m looking forward to having a little more time this summer to explore how else I might do this. I welcome recommendations.

Reflections and Practice Interviews: I uploaded the competencies into a GPT along with documentation on habit loops and other tools I use in class. Students can interact with this GPT in a number of ways. They can identify a strand they are struggling with and get some help. The tool will then ask the student to describe their experience with that strand. The student might then be prompted to recognize how habits impact this, and the specific cue that is triggering the problem behavior. Having gone through this reflection, they can get help creating a new SMART goal that they can work on on their own or with me during Student Directed Learning Time.
Students can also do an interactive self-assessment on the competencies or do a practice interview, receiving immediate feedback on their performance. It works with cooperative students... and with those who are less inclined…

Feedback
“Virtual Mr. Pultz” is performing well. As it is trained on the rubrics, resources, and banter I provide, students can use him to get feedback in the style that I usually give. I’ve played around a bit with having it grade student writing, but as Leon Furze and others have reported, there are inconsistencies that make this unworkable… at least for now.

Simulations, Tools, Games, Exploration

Both the Cross-Cultural Navigator and the Negotiation Tool, which I adapted from Ethan Mollick’s prompt, are performing well. Students are using these for novel opportunities to practice for their speaking exams. I’ve also used Student Directed Learning Time to have students create their own GPTs. They created a fun murder mystery role-playing game that takes the names of all the participants, assigns them roles in a murder investigation, and plays out the drama. When they see holes in the design, we discuss how we might adjust the prompt to eliminate them. Students can use evidence from these experiences to argue for mastery in a number of our competencies.

Next Time: Next week I’ll wish everyone a nice summer break and conclude this series with the article I’m most excited about: Giving Students the Space and Structure to Surprise You. In that piece I’ll also discuss some of my students’ thoughts on GenAI.

Taking the Leap of Faith into Personalized Learning

My team and I had just finished presenting at an education conference on the work our school was doing with personalized learning. The session had gone well, and many hung back to talk. Some said they had been inspired; others expressed frustration about wanting to do this work but not being able to. Our school had committed to school-wide personalized learning initiatives that looked different at each level. These were the early stages of the personalized learning movement, but being a full IB school, we were committed to a holistic core in each program.
It was an easy fit for the Primary Years (PYP) and Middle Years (MYP) programs, but there was anxiety about doing this with content-heavy courses in the Diploma Program (DP). For our DP students, the school decided to create something called Student Directed Learning Time (SDLT), which, at a glance, could be mistaken for study hall; the idea, though, was that students were given time to work on what they needed to work on, at their own pace. The goal was to give students a structured space where they could learn how to learn with teacher support. As a consequence of this approach, all courses, including the content-heavy ones, had their contact hours cut. To support students, teachers were encouraged to flip content and build self-directed modules where students could demonstrate their learning at their own pace. At the time, it was bold, risky, and ambitious.
As we were about to leave the conference room, the head of one of the more prominent international schools in the region shared his thoughts: “You guys are very lucky that your scores didn’t drop.”

Taking the Leap

At the time, I think I quipped something like, “Who brought Captain Sunshine?” but over the years I’ve come to better appreciate the stresses that administrators are under. Putting kids at the center of learning seems like an obvious point, but when those same kids are competing for spots within a given system, and those systems allocate numbers that grant you privilege, and that system and those numbers are not going anywhere, then the obvious point becomes moot. There are changes happening. The Mastery Transcript Consortium has recently partnered with ETS, and a more legitimate alternative to the grading factory is gaining recognition. But the process is slow. Grades are a legitimate part of the fears, not because of what they are, but because of what they have been made to represent. I’ll go into how I’ve worked with and around grades in greater detail in another article, but I want to acknowledge that this is a genuine concern that can derail a number of institutions from achieving their missions. It also caps the potential of teachers who are forced to showcase narrow approximations of student achievement. And, most importantly, it alienates students from a more complete understanding of themselves and where they might find belonging and purpose in this world…our business in the first place. So I’m grateful to have been a part of an institution that first took this leap for me. It granted me a richer and deeper experience of teaching than being a master of content could ever have afforded me. Once experienced, it’s not something that is easy to walk away from, and in a later piece I’ll discuss how I’ve kept this focus despite being in systems that do not openly support it.
But even being part of a system that made this decision for me, it was still a big leap to make.

First PD Session

We were shown a clip of two orchestra conductors. The first was the classical picture most are accustomed to: an older white gentleman in tails, waving a baton with a fierce look. His orchestra played well, moving to the beat of the conductor’s baton. The second was more progressive: a younger gentleman of color in casual attire who got things started and then let the smiling kids take over as he removed himself from the stage. The first was a picture of discipline as the orchestra conformed to the vision of its leader; the second was a picture of personalization. We discussed this a bit, and I wondered privately if it had to be one or the other, but we were basically told to make our classes like the latter.

First Days

I didn’t sleep well the night before my first class. I usually don’t, but this night the energy was more anxious than nervous. I was going into class with a completely flipped script, and I was starting with my syllabus. In previous years, I’d take advantage of the honeymoon period and talk at them for the duration of the period, cramming in every last detail of the course and classroom expectations. And because I said it, I could tell myself that the kids knew it. But now I was giving students the syllabus along with a series of questions to answer at their own pace. A true master of personalized learning, right? Not really. Like many teachers who start with personalized learning, what I’d started with was something I’d already done before: individualized learning. I set the objectives, I remove myself from the stage, I create the module, and the students navigate what I want them to learn at their own pace. Class is often automated, and I can shift my attention to individuals. Not bad, and I think this kind of practice has a place in personalized learning, but we can do better.
In the school’s SDLT initiative, we met with mixed results. Our disciplined students made good use of this time and sought out their teachers when they needed assistance. For our less disciplined students, it was a battle, one consistent with the usual issues of study hall. We measured our students’ performance on the usual course criteria. We made overtures to Approaches to Learning (ATL) and the IB’s Learner Profile, but it came across as a bit hollow. But as I mentioned earlier, our scores remained the same, so we were encouraged to commit to this approach and improve upon the practice.

Later Iterations

While at a new school, my commitment to personalized learning has remained, but my approach has evolved. Before I distribute my syllabus at the beginning of the year, I have students reflect on their proficiency with our competencies. I then ask students to set some SMART goals, to imagine some scenarios they might experience throughout the year, and to consider what questions those might lead to. Everyone has their own scenarios based on their own goals, attitudes, and abilities. I then distribute the syllabus to my students and ask them to look it over and see if they can find answers to their questions. If they can’t find answers, I help them to do so, and if the answer to a relevant question is not provided, then I thank them for the feedback and make the needed adjustment. This is a tone-setting experience for the students. We follow this up by completing Student Directed Learning Contracts–I do a new one each year as well. Here students review their competency-based self-assessment and create SMART goals to take their strengths further and to improve upon weaknesses. They then build a plan for their completion and give a clear picture of how they will define success. I then give feedback–ideally conferencing–on their draft, and they make the needed adjustments.
I then allocate 20-25 percent of class time each week to Student-Directed Learning Time, where students use a Student Directed Learning Planner to help them work towards what we agreed upon in their learning contracts. Adding a competency-based approach has really helped. It bridges the initial conductor-metaphor divide, creating a student-centered balance between clearly defined outcomes (discipline) and personalization. In the construction of those competencies, it has been helpful to follow two rules: start each statement with “I can” and avoid compounds. The latter is hard, but it is a good rule to try to follow, keeping the focus on the degree to which the student can perform the precise outcome.

Expect Pushback. Know Your Talking Points.

“Why are you experimenting with my kids?” was a question we received early on. As personalized approaches become more common, we get that less, but it is a good reminder to try to get ahead of conversations and be mindful of the words that you are using. You might be excited to experiment, but would you want your kid experimented upon? Instead, try starting with a clear and observable problem and how this approach will address that problem. Remind parents that grades are important, but that their child is much more than a number, and you’d like to help them grow in a more holistic way without sacrificing the mastery of content or the development of skills.

A Boon to Self-Directed PD

With this approach, particularly if you have a competency-based focus, you very quickly see the gaps in your practice. When I saw myself as master of content, I felt I was only a few classics away from where I wanted to be. This has opened the floodgates–you should see how many tabs I currently have open. But I enjoy it, and with a solid framework in place, I can add and subtract easily. A recent go-to for me has been Learning and the Brain conferences.
Creating GPTs to Support

If you are not yet on a frontier model, you are not experiencing this tool in its full capacity. If you are intimidated by creating GPTs, don’t be. Start small. Try creating one to help you with something that is fun. A few iterations and you’ll start to get the picture, and you’ll start to see where you can apply it professionally. Imagine, for example, feeding your syllabus and course documents to a GPT you built for class. Students could ask the needed question whenever they needed to and receive the needed answer. Or imagine a student getting help with one of your competency strands, say “attention.” The student is prompted to assess their current level of mastery. The student says “progressing.” The student is then prompted to reflect on their struggles, eventually exploring the habit loop and being assisted in creating a SMART goal, which they could then incorporate into their learning contract. Or say there is a long PDF you don’t have time to read, but you do have time to ask it a few questions. You could delegate the responsibility of ‘learning’ the material to an ‘assistant’ so that your own learning is more interactive. And if your assistant is not performing to your liking, you can give it an update. And while this may make some of you cringe, and perhaps rightly so, giving your GPTs personality traits has been shown to improve their performance.

Update on Tools

Ethan Mollick has recently shared three new GPTs. I’ve enjoyed using his “negotiation simulator,” and I look forward to exploring these. I also highly recommend the podcast “Beyond the Prompt” by Jeremy Utley and Henrik Werdelin.

Final Thoughts

Taking the first leap into personalized learning is scary, and I am grateful to have had administrators who pushed me and provided a parachute to help break my fall. I now have confidence in the approach to take my own leaps, which better serves my students as they prepare to take theirs.
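The syllabus-and-course-documents GPT imagined earlier comes down to a short instruction block pasted into a custom GPT’s configuration, with the documents uploaded as Knowledge files. Here is a minimal, illustrative sketch of how that block might be composed; the course name and file names are hypothetical placeholders, not my actual setup.

```python
# Illustrative sketch of instructions for a syllabus Q&A GPT.
# Course name and file names are hypothetical; in practice the
# documents are uploaded as the custom GPT's Knowledge files.

def build_syllabus_gpt_instructions(course_name, attached_files):
    """Compose text for a custom GPT's 'Instructions' field."""
    files = ", ".join(attached_files)
    return (
        f"You are a course assistant for {course_name}. "
        f"Answer student questions using only the attached documents ({files}). "
        "If the answer is not in the documents, say so and suggest asking "
        "the teacher rather than guessing. Keep answers short and point "
        "students to the relevant section of the document."
    )

instructions = build_syllabus_gpt_instructions(
    "English 9", ["syllabus.pdf", "classroom_expectations.pdf"]
)
print(instructions)
```

The “answer only from the attached documents” line is the important design choice: it steers the assistant away from inventing policies the syllabus never stated.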
Next week we’ll cover assessing and grading with a competency-based approach and how GenAI might be leveraged to support.

How GenAI Might Support the Evolving Role of the Teacher in a Personalized Classroom

I was fresh out of the Peace Corps, having served as an education volunteer in Madagascar, when I took my first teaching interview at a middle school in Brooklyn, New York. I was navigating reverse culture shock and an abundance of purposeful energy as I prepared to start my professional teaching career. The principal looked at me and looked at my resume, sizing me up. He put the paper aside and said, “This looks good, kid, but let me ask you something: what are you going to do the first time a kid tells you to go…yourself?” It was a different time. And it reminds me how long I’ve been in this profession. But what I remember even more than that question was what he said just afterwards: “Let me tell you something I tell all the new people: There’s no panacea in education.” I didn’t understand what he meant at that moment. I was fresh out of university and further enlightened by my time abroad. I was a master of content and pretty sure that I was the panacea that education needed, the embodiment of best practice. I was John Keating meets Jaime Escalante meets Mark Thackeray, the sage on the stage who was going to save ‘em all with his voice and a piece of chalk. I learned a lot that year. I learned that what worked second period didn’t work in seventh. That student X could have a great day on Monday, but be a disaster on Thursday. That student Y was awesome when we did creative writing, but would completely shut down when we reviewed grammar. And I learned that my principal was right, that despite what my resume said, I didn’t know what to do when student Z told me to… Thankfully, my arrogance tempered a bit. I visited classrooms. I learned to ask questions. I saw a lot of what worked, and I saw a lot of what didn’t work.
At that time, we didn’t really talk about impacts. Towards the end of my time there, “value added” had crept in via the No Child Left Behind Act, which was measured solely on performance gains on state exams from one year to the next. Day to day, though, we talked about what “worked,” and that meant that your class was safe and orderly, that most of your students liked being in your class, and that they performed better than average on the state exams. I managed all three parts, so I was a good teacher. I was entertaining, and I cared, and I did a couple of things differently. For the most part, though, I chalked and talked, weaving my way through progressively-minded desk clusters. Kids had their own independent reading books that they chose from a classroom library, and they wrote their own creative pieces from a list of options I gave them, and we did an open mic on Fridays where they could share what they’d written. The open mic evolved over time, and we incorporated music and stand-up pieces, and we had a rotating MC, and the students took the initiative to invite their peers and other teachers while I mostly sat on the sidelines. Looking back, that was the best part of the week, and I don’t know why it never dawned on me that more of the week could be like that. We aimed where we aimed, and that was as far as we reached. Thinking back on those early classes, 30-40 kids in each, with different capabilities, I think about how I would design that course today. Despite being a “good” teacher who did what “worked,” kids slipped through the cracks as I patted myself on the back for having above average test scores. In focusing on the aggregate, offering a solid median product, I missed opportunities with individual students. I didn’t realize that until I saw how impactful personalized learning experiences could be. When you aim differently, the places you reach can surprise you.
Defining Terms

Because there have been simultaneous movements to address this common concern, the terms can overlap in places and are sometimes misapplied, so I’ll outline them here to avoid confusion.

Individualized Learning (IL): Students progress at their own pace with a set curriculum towards common objectives.

Differentiated Learning (DL): Students progress at their own pace with a differentiated curriculum towards differentiated objectives based on their needs and abilities.

Personalized Learning (PL): A paradigm shift where the learning experience of each student is central to the creation of objectives and curriculum.

Universal Design for Learning (UDL): A framework that guides the creation of personalized learning, ensuring that all students can access, engage, and express.

Competency-Based Learning (CBL): An approach where students work towards mastery of competencies, which are defined as knowledge, skills, abilities, attitudes, and behaviors.

As Audrey Watters, “the Cassandra of the Edtech Movement,” points out, personalized learning has been with us for much longer than the phrase has been popular. My students’ open mic was personalized learning before I knew it by name. Every Friday, one day out of five, my students were at the center of what we were doing. For the other four, it was content and curriculum in preparation for the state exam. It “worked,” but in hindsight the balance feels off.

Impactful Learning and How GenAI Might Assist

In later articles I’ll go into greater detail about what personalized learning looked like in three very different schools in three very different settings. For now I’ll offer that while it can be a bit frustrating having to reinvent the wheel in different environments, it’s also a meaningful challenge to try to get these balances right within different contexts, such that the needs of each student are met.
It also keeps me on my toes, keeps me collaborative, and keeps me creative, constantly seeking ways in which to be more impactful. Shifting from what “worked” back in my Brooklyn days to what I have evidence of being impactful today, John Hattie’s research on feedback and reflection stands out. Feedback takes a lot of time, though, and students need help with reflection. General use of GenAI tools and more focused use of GPTs can help with both, but they need to be incorporated thoughtfully, enhancing what we already do, not replacing it. My first GPT, “Virtual Mr. Pultz,” continues to offer promising returns. My students like the immediate feedback. Students who don’t regularly seek me out in class or during office hours have reported feeling comfortable interacting with the tool at home. I do not plan on giving up written or face-to-face feedback anytime soon, but so far this tool seems to be improving the access that my students have to more immediate feedback. To assist students with reflection, I built another GPT that incorporates our competency rubrics, our Student Directed Learning Contract, our Student Directed Learning Time Planner, and our Habit Intervention Form. The tool asks students which competency strand they would like help with. It then asks the student to identify specific instances where the issue presents itself. Once both parties seem to have identified what cues the unproductive habit, the tool then takes the student through a habit loop reflection. From there the student is invited to create a new SMART goal for their SDL contract and to share their SDL plan for working towards that goal. I still want eyes on this process, but, again, these conversations increase access to a reflection “partner.” In addition to feedback and reflection, advances in brain-based learning have shed light on some of the nuances of student engagement and how that can be impactful.
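The reflection GPT’s conversation moves through a fixed sequence of stages. As a thought experiment, that scripted flow can be sketched as an ordered list of prompts; the stage labels and wording here are my own illustration, not the GPT’s actual configuration.

```python
# Hypothetical sketch of the reflection GPT's scripted flow.
# Stage names and question wording are illustrative only.
REFLECTION_FLOW = [
    ("choose_strand", "Which competency strand would you like help with?"),
    ("locate_instances", "In what specific instances does this issue present itself?"),
    ("habit_loop", "Let's walk the habit loop: what cue triggers the unproductive routine, and what reward follows?"),
    ("smart_goal", "Draft a new SMART goal for your SDL contract to replace that routine."),
    ("share_plan", "Share your SDL plan for working towards that goal."),
]

def next_prompt(stage):
    """Return the question for the given stage index, or None when the flow is complete."""
    return REFLECTION_FLOW[stage][1] if stage < len(REFLECTION_FLOW) else None
```

Writing the flow down this way is useful even if you never run it: it forces you to decide what order the reflection should happen in before you hand the script to the GPT.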
While I’m sure it won’t surprise educators to learn that engaged students perform better, they might be interested to learn that students are most engaged when they are emotionally connected to a task or when they are experiencing something novel. With this in mind, something that I’ve begun experimenting a bit more with is the construction of simulations. Ethan Mollick and Lilach Mollick put out a great article that showcases how they have built effective prompts. I’ve had my students work with their negotiating tool, the prompt of which is graciously shared in the article. I’ve also modified that prompt to be a cross-cultural scenario builder. In the former, my students gain a novel way to practice their English speaking skills while making cross-curricular connections. In the latter, they get to simulate a real-world encounter that challenges them to consider alternative perspectives, which, depending on what they input, can often lead to an emotionally charged experience. At the end of each interaction, students again receive feedback on their performance, and they are invited to try another simulation. While I’ve been excited to experiment with some of these tools, I have been a bit frustrated by the uneven rollout of access to frontier GenAI models. As this has been slower in some parts of the world, many of my students still do not yet have access to the strongest version of this tool, and some of those who do have access have reported losing it after only a few interactions. To mitigate this, I’ve shared the GPT prompts with students so that they can use them with the 3.5 model. I’ve also tried gamifying the quick loss of access by making a contest of who can prompt for the best feedback within a certain number of interactions. Hopefully this access will improve over time, but it is an area to keep in mind when offering these tools to our students.
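Turning the negotiation prompt into a cross-cultural scenario builder mostly meant swapping the scenario parameters while keeping the role-play and feedback scaffolding intact. The sketch below shows that parameterization idea in miniature; the scaffold wording is mine, not the Mollicks’ published prompt.

```python
# Illustrative template for parameterizing a role-play simulation prompt.
# The scaffold wording is my own, not the published negotiation prompt.
SIMULATION_TEMPLATE = (
    "You are role-playing a {scenario} with a student practicing English. "
    "Stay in character as {counterpart}. After the student ends the "
    "exchange, step out of character and give feedback on their {skills}, "
    "then invite them to try another scenario."
)

def build_simulation(scenario, counterpart, skills):
    """Fill the fixed scaffold with scenario-specific details."""
    return SIMULATION_TEMPLATE.format(
        scenario=scenario, counterpart=counterpart, skills=skills
    )

cross_cultural = build_simulation(
    scenario="cross-cultural misunderstanding at a market abroad",
    counterpart="a local vendor with a different set of social norms",
    skills="speaking skills and perspective-taking",
)
```

Keeping the scaffold fixed and varying only the three slots is what makes it cheap to spin up a new simulation: the feedback routine travels with every scenario for free.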
Final Thoughts

As many others have stated, I do feel that GenAI will be impactful on education, but not in the way that some edtech companies are touting. Instead, I feel GenAI is best suited to supporting personalized learning within existing frameworks such as UDL. In the case of UDL, the aim is to design experiences that increase student access to learning while enhancing both their engagement and expression. Add a competency-based approach to that, support these initiatives with thoughtful GenAI usage, and we might be onto something. In my own practice, first and foremost, I am and always will be seeking to build relationships with each of my students through a thoughtfully constructed curriculum that balances their own curiosity and passion for learning with the expectations of the particular context I’m working within. It’s a balance, and it’s one I’m constantly tinkering with, and I just got a shiny new set of tools to play with. But any time I get a little too excited, I think back to that first interview twenty-four years ago, and I’m reminded of my principal’s words: “There’s no panacea in education.” He was right then, and he’s right today, but that doesn’t mean we shouldn’t aim a bit higher. Next week, we’ll explore what it means to take a leap of faith with personalized learning—moving from theoretical potential to practical application.

Integrating Generative AI into Personalized, Competency-Based Learning Settings

Now that frontier models of generative AI are no longer stuck behind a paywall, how do we effectively integrate these tools into the flow of our teaching and student learning? While I’d be skeptical of anyone claiming to have a concrete answer on this, it is possible that we already have some promising ground to build upon.
This week I launched two new LLM tools in class to complement and enhance the personalized, competency-based learning initiatives that I’ve been developing with my students over the course of the past 12 years. Both tools are offering promising returns, and neither would have been imaginable to me a year ago. At the same time, I’m not sure if I’d be as open to experimentation and early implementation if personalized and competency-based learning hadn’t already pushed me to reimagine my role in the classroom and the opportunities it afforded student learning. Finally upgrading to ChatGPT-4 last month, I excitedly began to explore how I could build my own GPTs to support my students. I was impressed, but frustrated. No matter what I built, I was going to face an accessibility problem, as the full benefits of the tools and prompts I was experimenting with would sit behind a paywall that many of my students couldn’t cross. However, with OpenAI’s recent announcement that the free tier would be upgraded to GPT-4o, this is no longer the case, or at least it won’t be for much longer. With this universal access in mind, I identified two immediate problems I am having. First, my students need more one-on-one writing instruction, and there’s only so much of me that I can make available in class and office hours. I like grading some of my student writing, but not all of it, and not all the time. Every new assignment I give means more work for me to do outside of teaching hours. I could suggest that students just ask an LLM for help, but as Ethan Mollick has warned, that often leads students down the road of temptation, where they can ‘offer evidence’ of achieving outcomes without having to go through the necessary steps to properly develop the skills to achieve those outcomes on their own. I remind my students often that this is a sure path towards being made obsolete in this coming age.
The second problem relates to a complicated oral commentary that my IB Language and Literature students have to perform. For this, they are not allowed to “rehearse” with me. They have already performed a mock and received feedback on it, but this is a completely new task with new texts to comment on, and outside of an initial outline that I am allowed to give feedback on, they are on their own.

Two New Tools

For the first problem, I built a GPT called “Virtual Mr. Pultz,” a sarcastic and helpful writing coach with a Captain Nemo scowl embedded on a bust of a cartoonish bald ginger in a tweed jacket and tie with question marks floating in the background. It’s the WWE heel character that I never got the opportunity to play. Virtual Mr. Pultz starts by reminding students that “Test day is a rest day for the well prepared. How can I help you prepare?” It then asks students to share the assignment and rubric and prompts them to share what they think they did well and what they think they are struggling with. It then asks to see their work and gives them feedback according to the rubric. It uses the coded language that I use when grading to identify weaknesses so that students can access the resources on our LMS. Virtual Mr. Pultz is able to identify those errors quite reliably and offers a quick explanation with examples without correcting the initial mistake, though occasionally it does. It prompts students to do their own revisions and will then give further feedback. It also reminds them to review the resources I’ve provided on our LMS and to explore the opportunities to master those skills on Khan Academy, routines that I had already built into class independent of this new tool. Once Virtual Mr. Pultz helps students identify their target areas for improvement, it prompts them to create a specific plan to make the needed improvement.
It has been instructed to be encouraging towards genuine questions and issues, but if the student responds lazily or disagreeably, Virtual Mr. Pultz employs the same rule posted in the real Mr. Pultz’s classroom: “Warning: Whining and general disagreeableness will be met with strong sarcasm.” Virtual Mr. Pultz has so far responded to such laziness with “Test day is a rest day for the well prepared, but perhaps you prefer suffering.” Upon further whingeing, it quipped, “Suck it up, buttercup.” Here is a link to the prompt I used to create it. The second LLM tool I offered is a prompt that makes use of an existing GPT: IB Lang/Lit. While I’m still exploring its capabilities, I have had some strong early returns on students practicing their individual oral commentary. For this assessment, students need to identify a global issue, which is defined as an issue of significance that is experienced both locally and globally. The students need to speak for 10 minutes on this global issue and how it is seen in a literary text and the larger body of work of the author, and in a non-literary text and its larger body of work. The prompt has the AI assessor wait until the student says “thank you” before speaking. At that point, it generates five total questions, one at a time, for students to respond to. I’ve attached the prompt here for you to explore. After this, it assesses students on the rubric that is provided, tells students what they did well, and gives them suggestions for improvement. The grading is a work in progress, but the question generation and feedback features are already working well. In the case of both LLM tools, there will be a lot of trial and error and tweaking along the way.
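The assessor protocol just described (stay silent until the student says “thank you,” then five questions one at a time, then rubric-based feedback) can be captured as a short instruction block plus a trigger check. The sketch below is illustrative; the wording is mine, not the actual attached prompt.

```python
# A hedged sketch of the oral-commentary assessor's protocol; the
# wording is illustrative, not the author's actual attached prompt.
ASSESSOR_INSTRUCTIONS = """\
Role: practice examiner for the IB Language and Literature individual oral.
1. Remain silent while the student delivers their 10-minute commentary.
2. Begin only after the student says "thank you".
3. Ask five follow-up questions in total, one at a time, waiting for the
   student's answer before asking the next.
4. Then assess the performance against the attached rubric: state what the
   student did well, followed by suggestions for improvement.
"""

def should_begin(student_turn: str) -> bool:
    """Trigger check: the assessor waits for the student to say 'thank you'."""
    return "thank you" in student_turn.lower()
```

The explicit trigger phrase matters: without it, the model tends to interrupt the commentary the moment the student pauses for breath.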
But in sharing these tools, imperfect as they may be, with my students under proper guidelines and expectations, I model an entrepreneurial mindset, one that opens many of them to exciting possibilities of what they might build on their own, leveraging this technology to support their own learning and to further explore their own passions.

Bigger Picture

While it might look like these opportunities aim at putting teachers out of business or passing the buck to a machine, I want to be clear that my aim is not to replace teachers or what teachers do. I am looking instead to enhance effective practice. With that in mind, over the next five weeks, I’d like to openly reflect on a 12-year journey that I’ve taken with personalized learning in three very different school cultures, a journey further bolstered by a competency-based approach introduced to me five years ago. I am emphasizing personalized, competency-based learning because I believe it is the best holistic approach to serve our students and their modern sensibilities. It has the potential to bridge traditional practices with the skills needed to be competitive in the modern workplace while utilizing our existing technological capabilities. This article is the introduction to the series. Each week I aim to reflect on different parts of the experience, highlighting unresolved issues and imagining how LLMs might assist this work in the future. I'll also offer a brief update on how the new tools are working and other initiatives I'm experimenting with. Listed below are the weekly focuses along with brief descriptions. I welcome feedback on these reflections, and I hope that it generates useful discussion. Next week’s article will focus on the evolving role of the teacher. It will be released next Thursday.
Evolving Role of the Teacher in a Personalized Classroom and How GenAI Might Support

In a classroom driven by personalized learning, the role of the teacher shifts to varying degrees to that of facilitator, curator, mentor, and coach. That doesn’t mean the sage has to completely give up the stage, but that stage certainly needs to be shared. This section will explore this evolving role and how AI tools can support the transition.

Taking the Leap of Faith into Personalized Learning

I was lucky enough to have an early experience where the school I worked at took a collective leap of faith, reducing classroom contact time with students and finding alternative curricular paths to achieve outcomes. I’ll share what that felt like, warts and all, and how it evolved over time. This section will provide practical first steps for educators looking to make personalized learning work and show how incorporating LLMs can support the shift.

Assessing and Grading and Leveraging GenAI to Support

With personalized learning, traditional grading systems are often inadequate. This part will discuss more dynamic assessment and grading practices and how LLMs can help. It will span grade-heavy systems to more progressive mastery systems.

Adapting Personalized, Competency-Based Learning Across Different Systems

This section will draw on my experiences working within various educational systems—from a school that collectively embraced this approach, to one with more of a suitcase curriculum, to a much more rigid state system. I'll share insights on how personalized learning can be adapted and implemented in different institutional settings, each with its own challenges and opportunities.

Giving Students the Space and Structure to Surprise You

Students often exceed our expectations when given the tools and opportunities to direct their own learning. This concluding section will highlight some inspiring examples of student achievements that were made possible through personalized learning.
It will also share some early successes that students are having while collaborating with LLMs.

A Few Thoughts on Generative AI in Education from a Recent Luddite

Over the weekend, I had the pleasure of presenting at the AI for Education summit. My talk was titled “Keeping the Human in Human-Experience Design,” and I focused on some of the work that I am doing in my English classes and Design Thinking Club as I try to integrate responsible generative AI experiences into my students’ learning. It feels strange to be on the early adopter side of this conversation because for most of my teaching career I’ve been demonstrably low-tech. I was one of the last of my friends to buy a cellphone, I’m not on most social media, I only recently got a smartphone, and I once gave a PD session in defense of paper and pen. But when ChatGPT-3 launched, something felt very different to me. Like many, I was amazed at what a word-predicting tool could conjure up, and then I was horrified to see that it could competently complete a good number of the writing assignments I gave. But then I returned to amazement at how it could use backwards planning to give me a solid work schedule, how it could create diverse personas to give feedback on a design proposal, and how it could help me outline the plot beats for a screenplay I was working on. And then I started feeling overwhelmed by both the possibilities and the number of AI tools that were entering the market, the prompt libraries, the ethical considerations. I wondered too whether, in my excitement over what AI could do, I was losing sight of what was actually in the best interest of my students’ learning. And then I discovered that I was in good company with these feelings because as far as AI in education goes, there are no experts. As ChatGPT-3 was released just before my daughter was born, I’ll be watching them grow up together. Its release offers a schism in my work life like the one her birth offers in my family life.
And for those of you with kids, you know full well that everything changes. Sometimes while rocking her back to sleep after a bad dream, I’m reminded that ChatGPT will be the lowest form of technology she will ever interact with, and then it’s my turn for a bad dream, trying to puzzle out what her life will be like. Then I give a chuckle at my seven-year-old self, once mesmerized by the ability to manipulate crude white pixels on a screen the first time I played Atari’s Pong. What must my parents have thought as they watched me practice such sorcery? While we are still in the early stages of integrating this new technology into our workflows and lives, I thought I’d share a few observations I’ve made from early adoption. Large Language Models (LLMs) like ChatGPT, Gemini, and Claude offer vast opportunities for enhancing learning, but they also present unique challenges that need careful consideration. To address these challenges, I’d like to offer two problems that I’ve encountered and three solutions that I am currently working with.

Problem 1: Attitudes Towards Generative AI

Many people don’t really understand what this technology can do, nor do they understand what they can do with it. Part of the problem is the affordances that the tool suggests. It is such a general tool, and it looks like Google, so it is easy to treat it like a search engine and dismiss it the first time an inadequate or hallucinatory response comes back. Conversely, the magic of those initial outputs is problematic because people tend to accept them as-is, simply because the tool produces them much faster than they could themselves. Those early magical returns can be tempting to submit for assignments, particularly if the individual lacks the skill or motivation to produce the work on their own.
Additionally, the quality and speed of the outcomes can demotivate the acquisition of the underlying skill: if a tool can do this for me, why should I bother learning how to do it myself? Lastly, when people do use the tool, they are reluctant to admit it for fear of the attached stigmas. If I am using the tool to produce outcomes I did not produce myself, then that must mean that I lack the ability to produce those outcomes. And since we are still wedded to the idea that good work must take a considerable amount of time, the fact that this tool allows us in many cases to work faster might create the impression to others that we are not working hard enough.

Problem 2: Human Collaboration and AI

Another significant issue is the impact of LLMs on collaboration. When individuals interact more with screens than with each other, the essence of teamwork is lost. The focus shifts from engaging with peers to interfacing with a screen, potentially stunting the development of crucial interpersonal skills. Since the main purpose of project-based learning is the process, not the product, carelessly outsourcing that work to LLMs would mean losing the soft skills that are normally developed during collective problem solving. Additionally, the quick returns that LLMs offer can lead groups to settle on initial solutions without pushing further. And it’s not just students who are prone to this; a recent study out of Stanford University showed that professional designers also fall into this trap. Given two-time Nobel laureate Linus Pauling’s dictum that “the best way to have a good idea is to have lots of ideas,” this is a particularly alarming observation.

Solution One: Demystifying AI

To combat these challenges, a first step recommended by Stanford Design School’s Jeremy Utley is to have an emotional, interactive conversation with the tool. I’ve demonstrated conversations I’ve had with ChatGPT to my students that have really impressed me.
I’m further impressed by the capabilities of ChatGPT-4, particularly the Whisper-powered voice feature that allows me to record a rant or have a conversation about a half-baked idea that I would normally make my wife suffer through. Whatever I communicate is recorded, and then I can have my rambling thoughts reorganized into something more coherent to explore later on. Seeking solutions for our daughter’s education, I can have ChatGPT take on the roles of three different personas asking me questions to help me better understand our circumstances. And while interactive conversations like these have their own rewards, the greater value is the epiphany that comes from discovering some of the capabilities of this tool. If it helped me better understand the circumstances of my daughter’s education, might it be able to help me brainstorm ideas for next week’s research project, give me a novel hook for tomorrow’s lesson, or break me out of a creative rut for tonight’s dinner? Perhaps a useful way of thinking of generative AI is as a precocious intern who can take on various and multiple personas and who is always available, but whose work needs to be checked.

Solution Two: The FIXIT Framework

-Focussed
-Individual
-ConteXt
-Interactive Conversation
-Team

Adopting the FIXIT framework can help reinforce new habits with this new technology. Individuals start with a focussed problem or interest. They then work individually on that focus. This ensures that their unique, authentic, and human perspective does not get lost in the dynamics of AI or group collaboration. After the initial individual work, they provide the necessary context to the generative AI tool. For this, there are numerous resources to share (here’s one). Alternatively, students can experiment with a trial-and-error approach. Once sufficient context is established, they share their individual work with the LLM and engage in AI-assisted ideation through interactive conversation in order to enhance their ideas.
Following this, they take their findings to their team. And, as is the case with the design cycle, the process repeats. This method encourages individuals to engage deeply with the material on their own before leveraging AI in their own work and then offering that to the team. It fosters a balance between independent critical thinking, AI-assisted critical thinking, and collaborative thinking and innovation.

Solution Three: Encouraging Exploration

People need to play with the tool and discover both what it can do and what they can do with it. As such, I’ve introduced a weekly prompt-generating competition that acts as a “show-and-tell.” Students are encouraged to use the FIXIT framework in their approach. It is low-stakes fun that allows for student exploration, and because it is self-directed, students take ownership of the experience without being held back by their teacher.

Continuing the Conversation

To me, the rapid advancements in generative AI that I navigate in my working life parallel the milestones my growing daughter introduces in our family life. Both are filled with excitement and anxiety, and both force us to continuously adapt. Just as my daughter shows us that no barrier is too high and no object too far out of reach, we should recognize that AI will similarly stretch the boundaries of what we thought possible. In the case of our daughter, my wife and I strive to foster curiosity, confidence, and independence, balancing our own anxieties with the need to let her explore safely. I believe we must approach AI with a similar mindset: embracing its potential while cautiously navigating its challenges. As we continue to refine the positions and policies that will inform our practices with generative AI in the classroom, I invite feedback and questions on these reflections. I also encourage you to conduct your own experiments and to explore.
Let’s engage in meaningful conversations about how we can integrate this technology into our teaching, ensuring it enhances student learning in a balanced and thoughtful manner. If you are interested in learning more, I recommend the following resources:

-How to FIXIT
-Don’t Let GenAI Limit Your Team’s Creativity–Harvard Business Review
-Beyond the Prompt Podcast–Available on Apple, Spotify, and other podcast platforms
-A Suggested Rubric to Communicate AI Expectations on a Given Assignment
-Try Ten–An AI-assisted Ideation Tool in 5-minute Daily Exercises
-Ethan Mollick and Lilach Mollick’s paper on how to leverage AI tools into personalized learning experiences for students
July 2024