This past week, I was fortunate enough to be in Vancouver to attend FISA 2016, the education conference for independent schools across British Columbia. Although my work is in the public system, I was keenly interested to learn from a number of the speakers in the lineup, including Daniel Pink, Yong Zhao, and Dr. Charles Fadel from Harvard. There were a number of pieces that resonated with me from each of the speakers, but a couple of shots in particular hit me squarely between the eyes, especially in light of my work in helping to implement BC's new competency-based curriculum across our district.
Last year, I got into a spirited conversation with a colleague about "kids these days": she was expressing her frustration with the new curriculum. She felt that by focusing on what she saw as some of the broader, more ethereal concepts like personal identity and creative thinking, we were losing the rigor of the content standards that our current curriculum requires students to learn. "Whether kids like it or not, there are some facts that they just need to know!" Without trying to be too obtuse, I asked her, "So which facts are the ones that kids 'need to know'?" And moreover, given the diversity of our schools, our learners and their individual backgrounds, I asked her how we were supposed to determine which facts were 'the right ones'. Exasperated, she laughed and said:
"I just never want to be sitting in my doctor's office and have him need to look something up on Google!"
And while I know that she was using this specific example to make a general statement that students can't simply depend on Google, I have thought about our conversation for a long time since. As a point of interest, I have subsequently asked numerous doctors (including specialists such as a radiologist and an internist) if they had ever used Google (or some sort of search engine/online tool/connection) to help them in their job as a physician. And while I am certainly not Gallup, I can say without equivocation that every one of them said 'yes'. And usually it was not just 'yes'; more often it was 'absolutely!'
During his compelling talk that took the FISA audience into the future of artificial intelligence and curriculum re-design, Charles Fadel made a statement that underscored my thoughts. He said:
"Would you rather take the chance that your oncologist has read the ten thousand plus articles on your particular cancer, or would you rather that they are working with an AI assistant that actually has read and summarized the knowledge from those ten thousand articles?"
In the outstanding video "Future Learning" (please take 12 minutes to watch it), Sugata Mitra talks about the curriculum he would write to best prepare K-12 students for life beyond 2030. It would have only three parts: reading comprehension, search and retrieval skills, and the ability to believe, and thereby 'avoid doctrine'.
In the context of Charles Fadel's question, I want my doctor to have incredible reading comprehension skills, to have outstanding search and retrieval skills so that they can be as informed about current research and techniques as is humanly (and, with technology, 'inhumanly') possible, and to be able to avoid doctrine and believe, so that they can determine what is real and what is nonsense, as Sugata Mitra says.
So, would you want your doctor to use Google? I would.
In order for your doctor to understand incredibly dense medical articles, isn't a basic (and maybe not so basic) understanding of the fundamentals of biology necessary?
I recently introduced a physicist working at a major university to my class. I thought it might be interesting to give him a grade 12 physics quiz. He rattled off answers instantly, because understanding first principles is essential to understanding the level at which scientists perform their job today.