31 May 2016

Understanding the TOK Presentation Rubric


The holistic TOK Presentation rubric asks one main question:

Do(es) the presenter(s) succeed in showing how TOK concepts can have practical application? 

However, if you are a student, what does this mean in practice? Last year I began teaching the rubric as having three main parts, which has proven to be helpful. A great presentation will take all three of these main parts into account.

Just because we can break the Presentation Rubric down into 3 parts does not mean these parts are weighted equally. The Exploration of the Knowledge Question should be seen as the "meat" of the TOK Presentation sandwich, with the Knowledge Question formulation and the Outcomes of Analysis serving as the bread. The better the ingredients in the middle of the sandwich, the better the sandwich overall.

The 3 Main Parts of the TOK Presentation Rubric

1. Knowledge Question formulation

Formulating a good Knowledge Question is the first step in a successful TOK Presentation. A TOK Presentation KQ needs to be directly derived from the RLS. It might be helpful to look at the "Think TOK Model" developed by OUP to understand the formulation process.
Understanding the Level 5 descriptors for #1:
  • Well-formulated means it is 2nd order, focused on TOK concepts, and can be applied generally to more than the RLS in question
  • Clearly connected to a specified RLS means that you can clearly explain the development of the KQ from the RLS by using an understanding of knowledge claims
TK/PPD Application
"Explain the connection between your real-life situation and your knowledge question:"
  • Students should be able to explain how they got from their RLS to their stated KQ by explaining their progression from:
    • RLS to an isolated 1st Order Knowledge Claim
    • 1st Order KC to 2nd Order KC, replacing 1st order language with 2nd order language using the "tools and language of TOK"
    • 2nd Order KC to KQ, explaining why the KQ is important to investigate with reference to the RLS
Student Check
Create a quick 2-minute prototype of your RLS-to-KQ progression using this Stage 1 Planner. Then say it out loud to an audience to get feedback.

2. Exploration of Knowledge Question

Exploring the Knowledge Question is the heart of the TOK Presentation. The exploration is the main way to show how TOK concepts have practical application. Exploration in the presentation should mainly stay in the 2nd Order world, connecting the RLS and KQ using the tools and language of TOK. The exploration should help create an understanding of the KQ by developing 2nd order knowledge claims that use the TOK framework selected by the student(s). To understand this exploration phase a bit better, we can look at the official IB deconstruction of an exemplar TOK Presentation.

Understanding the Level 5 descriptors for #2:
  • Effective exploration means that the analysis is based within the framework of the Ways of Knowing, Areas of Knowledge, and Personal vs. Shared Knowledge that we use in TOK
  • Convincing argumentation means that the exploration is based on an insightful, well-thought-out, and understandable reasoning process
  • Investigation of different perspectives means that you have taken into account how other knower(s) might view the analysis of the KQ itself, not merely different perspectives within the KQ. This can be done in many ways; usually a consideration of other TOK concepts is helpful (for example: belief, bias, certainty, culture, evidence, experience, explanation, interpretation, intuition, justification, limitations, reliability, subjectivity, truth, values)
TK/PPD Application
"Outline how you intend to develop your TOK Presentation in the context of your real-life situation. Include analysis..."
  • A TOK Analysis is an exploration of a knowledge question that creates an argument for a (possible) conclusion to the knowledge question. This is the heart of the TOK Presentation—showing how TOK has practical application.
  • Students need to develop and document a TOK Analysis of their stated KQ, along with at least one different perspective on their analysis. As a general guideline:
    • For groups of 1, the student should create 1 TOK Analysis and 1 different perspective
    • For groups of 2, students should create at least 2 TOK Analyses and at least 1 different perspective
    • For groups of 3, students should create at least 3 TOK Analyses and at least 2 different perspectives
Student Check
Practice creating a TOK Analysis using this handout. Then create an outline of your work using this Stage 2 Planner.

3. Outcomes of Analysis

The outcomes are the "conclusions" about how to "answer" the KQ. This is the main reason why there is a presentation--because there are insights to share with others. An outcome is a 2nd order claim rooted in the TOK framework chosen by the student(s). These should be clearly stated and connect directly back to the real-life situation. If we look back at the exemplar above, the outcome of the analysis was that scientific prediction as a validation tool has both strengths and weaknesses, which were then applied back to the RLS in question, and to others in both the Natural and Human Sciences.

One of my better TOK Presentations last year focused on the KQ: In what ways is knowledge dependent on language? The group's main outcome (2nd order knowledge claim) was that knowledge is dependent on language in many ways, but that we can communicate our knowledge in more ways than just through language, for example with art and emotion. This proved to be an excellent way to both understand the KQ and to apply the outcome more generally to other RLSs.


Understanding the Level 5 descriptors for #3:
  • Showing significance to the RLS in question and to others means that the outcomes are compelling, that they do influence the way knowledge is created, and that these conclusions can also be applied to other non-connected RLSs
TK/PPD Application
"Show the significance of your conclusions with particular reference to your real-life situation and indicate how those conclusions might be relevant to other real-life situations"
  • The results of your TOK Analysis should be clear, 2nd order knowledge claims. These 2nd order knowledge claims should be directly connected back to the original real-life situation in a way that helps us "knowers" understand the knowledge question better, and might serve as a possible "answer" to the main knowledge question.
  • Showing relevance to other RLSs means that the conclusions reached in the TOK Analysis are not limited to just one RLS, but that they have a wider reach, and can give insights into how knowledge is acquired and/or constructed in general.
    • The simplest way to do this is to show how the conclusion can be applied to other RLSs that are not connected to the primary RLS.
Student Check
You should be able to easily state to anyone who asks:

  • what your conclusion is, 
  • how and why it matters in helping you "answer" your Knowledge Question, and 
  • the conclusion's impact on your identified RLS. 

.   .   .


Dividing the TOK Presentation Rubric into these three main parts should help both the teacher and the presenter(s) create superstar-worthy TOK Presentations.

16 May 2016

School Evaluation Has An Improvement Problem

photo credit: Neil
Last week we had our accreditation visit. 

The lead facilitator said two things to me, as a learner, that I believe he meant as compliments: 1) that I am a button pusher, and that we (education) need more people like me, and 2) to keep being an agitator.

I believe we (as educators) also consider those to be good dispositions for our students. Why? Because in our ever-changing world we want them to be original challengers of the status quo.

But how do we, as teachers, administrators, and educators, usually model ourselves as learners? Take, for example, an accreditation process. We spend quite a bit of time looking at the past and present—the status quo. But why do we only spend time on defending old stuff rather than also trying to figure out new stuff—to be challengers?

What do we believe drives deep learning? What do we want our kids to do? Then think about what we practice as learners ourselves.

I think the goal of a good accreditation process is to be program agnostic and learning specific. When I asked about the role creative disruption and experimentation should have in a school evaluation process, one of the facilitators agreed and talked about needing to find a balance between the past, present, and future. I completely agree. But the evaluation model has only structured spaces for the past and present. 

Where are our spaces in evaluation for the type of learning we value in our students? Why don't we put some of our beliefs into the curriculum (8th ed. School Improvement through Accreditation) we use to evaluate and measure learning? Where is room for the practitioners to use creativity, experimentation, innovation?

The current school accreditation model, to use Russ Ackoff's terms, structures efficiency within the existing model. By using standards, indicators, and closed modeling, it chiefly looks for improvements within discrete areas (Sections A-G) and discrete time frames (the 12-month self-study, the 5-year review cycle). But it does not structure conversations around what Ackoff calls creative discontinuity: structuring creativity to allow for the "seeing of realities that might otherwise be overlooked." Why should schools wait to transform learning until after accreditation has checked off standards scrupulously documented by siloed self-study committees generating volumes of evidence?

There is no balance. 

As I think the facilitators would agree, the Accreditation Standards and Indicators are descriptors, not prescriptions. So, combining two things I have been thinking about recently, evaluation and design, I've gone through the standards and picked out the ones that align with the goals of discontinuous improvement, grouping them into my own category, Category H: Transforming Learning:
  • A3f: … the acquisition and refinement of the skills of leading and following, collaborating, adapting to the ideas of others, constructive problem-solving, and conflict-resolution through experiencing leadership in authentic contexts. 
  • B6b: Teachers create stimulating learning environments that are evidenced by students who are engaged and active participants in their learning 
  • B9b: The school encourages pilot curriculum innovations and exploration of new teaching strategies, monitored by appropriate assessment techniques. 
  • D2a: Teachers utilize methods and practices which are consistent with the school’s Guiding Statements and which inspire, encourage and challenge students to reach their full potential.
  • F2e: The school creates student learning opportunities by effectively using the skills of its own community members and by building partnerships with external agencies such as local businesses and professional organizations.
  • F3a: The development and delivery of the school’s complementary programs demonstrate sensitivity to the needs and beliefs of different cultures, foster engagement with the local culture and promote global citizenship.
  • F3b: The school actively supports the development of student leadership and encourages students to undertake service learning. 
  • F3c: The school actively promotes and models global environmental awareness and responsibility across its community. 
  • F3d: The school regularly evaluates its complementary programs to ensure they remain aligned with its Guiding Statements, meet student needs and interests, and foster global citizenship.
Taken together, these indicators represent a very different idea of learning than is found in almost any traditional school that has not already begun to transform its learning DNA. They should shake the business-as-usual approach to learning. Does current school evaluation shake our approach?

Here is the argument. It is not that current evaluation models do not do good; they do, especially at schools that still have foundational work to do. But this is not an either/or proposition. It is a both/and.

There is no /and.

Without structuring experimentation into what this new learning DNA might look like, most schools will never get to transforming what they believe about learning because they have never been asked to. We need to study space and structure, and time, and transdisciplinary learning.

All accreditation models should have a transformative piece in place, a place for creativity and discontinuity. A place where it is explicitly said that it is ok to study experimenting, to study innovating, to study discontinuity, to explore the unknown and come back later with what you have learned, not what you have checked off. As a learner, that is the self-study team I want to be a part of.

We can continue to assume the future will look much like the past, and build the best model we can. But how do we know we are not building the best flip-phone? Or the best combustion engine? We do not (in fact all evidence points to the fact that we are). And that simple fact should force us to change. Not in 5 years, not in 12 months. But now.

Check out Part 1 here: The Backwards Design of School Evaluation

05 May 2016

Finding the Future(s) of School: Discontinuous Improvement and Scenario Planning


A couple of recent articles I have read and YouTube videos I have watched, plus a recent email conversation I had, have hit on something I have been thinking about for a few years but never had the vocabulary to talk about in any meaningful way—how can we strategically push school into an unknown future?

The first article was one Will Richardson recently wrote called "We're Trying To Do "The Wrong Thing Right" In Schools." He used some of the systems thinking ideas of Russ Ackoff found in this video:


Which led me to this video of his:

Those 21 minutes of Prof. Ackoff have spurred more synthesizing about the way I see my current world (teaching and learning) than probably at any time since I was a junior in college, discussing formal vs. substantive rights for the first time in my Political Philosophy class. In particular, his distinction between efficiency and effectiveness and his idea of creativity as a discontinuity—discontinuous improvement—have really created a new set of understandings for me.

Finding Ackoff led me to start an email conversation with a leading expert in applying a type of systems thinking called Lean Manufacturing (or the Toyota Production System) to health care (he also happens to be my father-in-law). He pointed me in the direction of the second article, found in the Harvard Business Review, called "Living in the Futures." It chronicles the 50-year history of a small but transformative experiment at Shell Oil in using creativity and discontinuous improvement—alternative futures planning.

Simply, alternative futures planning "is about explicitly recognizing and exploring plausible ways in which the future could unfold"(link). Sounds exciting, especially if the system within which you work is old, slow, and not designed for change. Why? Because as they found at Shell,
They encouraged strategic conversations that went beyond the incremental, comfortable, and familiar progression customary in a consensus culture.

Planning for Possible Futures

From the beginning, those engaged with Shell's scenario practice maintained that scenarios are not predictions but can provide a deeper foundation of knowledge and self-awareness in approaching the future. They also felt that the "official" view of the future—the business-as-usual-outlook—both reflects an optimism bias and is based on the human tendency to see familiar patterns and be blind to the unexpected.
One thing I am not is a systems expert. Nor do I have more than a cursory, ankle-deep understanding of systems thinking, Lean methodology, and alternative futures scenario planning. But I am completely intrigued by the concept of discontinuous improvement.

Because here is the riddle for me: how do you try to do what you are doing a bit better inside one system (more efficient?)—redesign—while at the same time preparing for those things moving to a different possible future/system (looking for effectiveness?)—revolution.

And this corollary: to what extent is looking for efficiencies within the current model (continuous improvement?) preventing us from preparing for a different possible future (discontinuous improvement?)?

For example, I believe there are things that are common to (almost) every sound learning design, and these small things can (probably) transfer to other systems. However, I also believe that school or education is going through a model crisis on an order of magnitude we haven't seen in our lifetimes. And most schools are not preparing for, nor even really considering, this future new system/paradigm.

That is why Ackoff resonated with me so much. If schools would take a "possible futures" exercise seriously, I would be very curious to see how far they would get in questioning the very model/system/paradigm they are operating within. And if they got that far, how would they square what they are doing in the classroom right now with "what would you do right now if you could do whatever you wanted to," as Ackoff said.

Conclusion

Scenarios encourage attention to the future's openness and irreducible uncertainty.
Most schools are not designed to incorporate continuous improvement methodology, let alone discontinuous improvement methodology. If they do have some type of commitment to improvement, it is usually reduced to improving test scores.

Ackoff, Shell, et al. have inspired a new round of questions for me.

  • What would futures planning look like in school?
  • What possible futures would be created in all the various contexts school operates in?
  • What does a commitment to discontinuous improvement look like in school?
  • How do we get school to institutionalize organizational creativity to allow for the "seeing of realities that might otherwise be overlooked?"
And they have inspired an urgency to frame new conversations. Because the conversations I am most interested in about education have moved past conversations about pedagogical strategies (important, but possibly only seeking efficiency). They are conversations about the ideas around "what would you do if you could do anything" with the design of school, the system(s) and model(s), the spaces and structures—what looks to me like a conversation about true effectiveness.

The power of this conversation, I think, is aptly summarized by a closing quote from the HBR article:
It will help break the habit...of assuming that the future will look much like the present.
