The article by D’Arcy Norman from the University of Calgary (Norman, 2017) was quite interesting.
The idea of collaboration carts piqued my interest even more. At KPU, we have some systems that are coming to the end of their life. The smartboards at each group table and the smartboard at the top of the room support inclusion and collaboration, but as they age, some screens work and some do not in a few of the classrooms, which hinders the collaborative and inclusive learning they were meant to enable.
I think the idea behind the collaboration carts was to offer flexibility and transparency, open up access to programs and technology for students, let go of control, and extend collaboration to everyone. To introduce this profound change, the main areas considered were flexibility, user control through a universal design for learning approach, having support available, and open access for all. Some barriers they encountered were technology limitations, system complexity based on each user’s needs, and warranty expiration, which resonates with my experience at KPU.
I would love to hear how it has progressed since 2017 and what they have learned. From my perspective, I wonder whether a continuous plan is in place to support upgrades and implementation. I also think about the ongoing training time and resources needed as the technology grows and changes for all users. Anyone who uses it needs some tenacity to get what they need when their ideas are not working, or when the technology changes and they need to learn more to adapt. Launching this as a pilot and gathering feedback as the project grew might have helped with testing, adoption, and adaptation, and with improving it as it continued to be rolled out.
Great insights, Leona! I believe a pilot program would have been an excellent way to foresee the potential issues that D’Arcy eventually identified. From the diagram, I see there’s a potential for 32 stations, which, if acquired and implemented all at once, would have carried significant risks. With a phased rollout strategy, the warranties would have expired in a staggered manner, reducing the potential impact of maintenance costs and allowing time to develop an evergreen plan.
I’d also love to see how the technology has held up since 2017, especially considering the hardware shortages during the pandemic and the shift from classroom training to online learning. Did the faculty and students feel lost without the technology, or would they say it was valuable but not essential, or more of a luxury?
Hi Leona and Allie,
You both wondered how the TI infrastructure and planning had evolved or changed since this initial report and reflection. I tried to do a bit of digging, and there are multiple reports, but interestingly not much is said about the space itself. They talk very much about the kinds of activities and programming they offer (and have a series of vision statements), but these are described from a teaching and learning focus. As Allie points out, perhaps post-pandemic the focus has really shifted to supporting blended and online learning initiatives. Would you say there has been a similar shift in focus in your organizations?
Thank you for your comments, Allie and Michelle, and for the items to consider. I hadn’t considered how the pandemic may have affected this work, and that may be why we don’t see continued reporting of satisfaction data (Marsh et al., 2006) to measure the results.
Thinking through the program that D’Arcy Norman and the U of C deployed (Norman, 2017), and adding more from our learning around data, has brought out some additional insights.
I would like to know whether the design team effectively leveraged data collection before deploying this idea. Where did they obtain the data used to decide on these solutions? Were there feedback and evaluation forms, or incident reports, that drove their decisions (Zettelmeyer, 2015)? According to Zettelmeyer, we sometimes collect data without considering why we are collecting it; what the team chooses to measure will only be valid if the data serves the goal.

The RAND article (Marsh et al., 2006) spoke to me in how it describes data as a way to initiate the need for change, and in the kinds of data to use: outcome data (what we want students and faculty to achieve), process data (what processes are involved in achieving the outcome), and satisfaction data, which I have not seen reported in any evaluation or feedback since 2017. I think the U of C project may have tried to pinpoint its outcome and process data, but this could have been stronger had they used focus groups and additional evaluation methods while designing and launching a pilot. U of C should go back and collect satisfaction data to ensure the spending produced the outcome they sought. In saying this, there are privacy concerns around collecting usage statistics and experience data (The Open University, 2023); the data should be kept private and anonymous by theming the context rather than sharing user information.
Collecting this data could help determine the effectiveness of this solution for students and faculty (Sclater et al., 2016), the effectiveness of the data use, and could help evaluate the instructions on how to use the equipment (Marsh et al., 2006).
From a planning perspective, the data could support awareness of the need for change and show where those needs are compelling (Sclater et al., 2016). It could also have helped build a case for the cost of the technology for this initiative (Zettelmeyer, 2015). Leaders and decision-makers need to learn how to distinguish between good and lousy analytics; understanding where the data came from is key to judging the quality of the data received, and keeping anonymity through theming could also help prevent bias from leadership (Zettelmeyer, 2015).
References
Marsh, J., Pane, J., & Hamilton, L. (2006, November 7). Making Sense of Data-Driven Decision Making in Education. RAND Corporation. Retrieved February 20, 2025, from https://www.rand.org/pubs/occasional_papers/OP170.html
Sclater, N., Peasgood, A., & Mullen, J. (2016). Learning Analytics in Higher Education: A Review of UK and International Practice (pp. 1–40). JISC. https://www.jisc.ac.uk/reports/learning-analytics-in-higher-education
The Open University. (2023). Student Policies and Regulations: Data Ethics Policy. Retrieved February 20, 2025, from https://help.open.ac.uk/documents/policies/ethical-use-of-student-data
Zettelmeyer, F. (2015, May 1). A Leader’s Guide to Data Analytics. KelloggInsight. Retrieved February 20, 2025, from https://insight.kellogg.northwestern.edu/article/a-leaders-guide-to-data-analytics