Openness in open courseware (OCW) and open educational resources (OER) requires an open licence, such as a Creative Commons licence, but is affected by several factors, both technological and pedagogical. This pilot study examines different factors impacting openness by looking at a very small random sample of 10 relatively recent open courseware offerings from TU Delft and MIT. This paper has two primary objectives: 1) to determine how open the sampled OCW are across eight factors of analysis; and, 2) to determine if the sampled OCW are suitable for educator reuse. The authors evaluated the sampled courses using an existing framework to conceptualize openness. The level of openness was evaluated across eight factors: copyright/open licensing, accessibility/usability, language, support costs, assessment, digital distribution, file format, and cultural considerations. The framework describes each factor across three dimensions of openness — closed, mixed, and most open — and each author coded the sampled OCW accordingly. This content analysis provided several insights into where sampled OCW succeeded and failed in terms of openness. Courses tended to be relatively open in terms of copyright, assessment, and digital distribution, but closed in terms of language, support costs, and file format. Factors such as accessibility and cultural considerations were more mixed; discipline and course content play a role in a course’s openness and reuse. This paper also serves a secondary purpose: evaluating the effectiveness of the framework itself for assessing openness. Openness is a spectrum, with an interplay between factors that determine openness. Greater attention needs to be paid to pedagogical considerations, rather than technical ones, when developing open content.

Contents

Introduction
Literature
Study design
Results
Study limitations
Discussion
Conclusion

Introduction

The UNESCO Recommendation on Open Educational Resources (OER) (UNESCO, 2019) represents an important development in recognition of the OER movement. While the significance of the UNESCO Recommendation should not be diminished, it clearly defines the “open” element of OER in copyright-centric terms; OER are either public domain materials or those with an open licence (UNESCO, 2019). UNESCO [1] defines OER as “teaching, learning and research materials ... that reside in the public domain or have been released under an open license.” Although an open licence for copyright-protected works is essential for resource distribution, this approach reflects an instrumental and licence-centric approach to openness and masks a more complicated relationship that open licensing has with other elements of openness. Open licensing is essential for granting permissions that allow works to be copied and shared; yet, it alone is an inadequate measure of openness because other factors dictate educators’ abilities to adapt existing materials and develop new ones.

This inadequacy is made apparent when considering openness in the context of open courseware (OCW), an important, though undervalued, component of open education. The Open Education Consortium (n.d.) defines OCW as “free and open digital publication of high quality college and university-level educational materials. These materials are organized as courses, and often include course planning materials and evaluation tools as well as thematic content.” While OCW are often published by universities, the content is freely available to anyone via the Internet.
Open licensing is necessary, but not sufficient, for educators who might want to adapt and adopt the material. A central tenet of OCW, as the Open Education Consortium’s definition suggests, is the inclusion of course planning documents for educator reuse.

McNally and Christiansen (2019) explored the question of what is “open enough?” The objective of their thought experiment was to propose reasonable goalposts around openness for educators, developing a conceptual framework for approaching the development of OCW. The framework outlines eight factors of openness to consider when developing OCW, based on open education literature. These factors include copyright/open licensing, accessibility/usability, language, support costs, assessment, digital distribution, file format, and cultural considerations. McNally and Christiansen described how each of these factors would look under three hypothetical scenarios of closedness/openness — closed, mixed, and most open (Figure 2). McNally and Christiansen also reflected on the challenges and downsides incurred by maximizing openness, as well as the increased workload to educators, providing guidance on what to consider as “open enough” in OCW.

Figure 1: List of randomly sampled OCW.

Figure 2: Open enough framework (McNally and Christiansen, 2019).

There are several qualitative approaches one could take to analyze OCW. However, given the lack of clear pedagogical guidelines for evaluating OCW openness, we sought to employ a novel approach as a pilot study. We report not only on how “open” the sampled OCW were and whether they would satisfy educator needs, but also reflect on the functionality of the open enough framework as an analytical tool. By employing and suggesting refinements to this framework, this paper outlines practical considerations that would be helpful to both educators interested in adapting existing OCW and administrators of OCW platforms.

The findings suggest that OCW need to be developed with both technical and pedagogical considerations in mind to be truly open. Sampled OCW were less open than expected, particularly in terms of their language availability, file format, and support costs; copyright licences also potentially limited openness. We conclude that the sampled OCW were open from a licensing perspective, but better suited for student consumption than educator adaptation. There are relatively low-effort strategies that could be employed to improve OCW openness and make them more educator friendly. These include using less restrictive licensing terms, providing guidance and documents to aid future language translation, supplying documents in multiple file formats, and including more open reading options, including licensed library materials. The sampled OCW were relatively open in terms of their accessibility (stated adherence to commonly observed standards), the availability of assessments, and discoverability through OER search engines.

Literature

The ascendancy of open content — from open source software to open access scholarly publications and OER — has led to a body of scholarship examining what openness means and what unites or distinguishes various forms of open content. Prior to the emergence of OER 20 years ago, education scholarship highlighted the complicated nature of defining openness in education (Noddings and Enright, 1983), with a variety of definitions evolving over the past century (Baker III, 2017).
During the relatively short history of OER, there have been attempts to define what is (or is not) OER (Pantò and Comas-Quinn, 2013). Narrow conceptions of openness tend to focus on legal aspects, centring on the open licence as the essential element of openness and equating openness with access (Knox, 2013); many early definitions of openness emphasized the centrality of open licensing (Wiley, et al., 2014). Wiley’s (n.d.) 5Rs (retain, revise, remix, reuse, and redistribute) position the legal rights conferred through an open licence as the essential and determining characteristic of openness. In addition to legal conceptions, openness is also connected to the collaborative and participatory nature of the Internet (Maxwell, 2006). Tkacz (2012) argues that openness is primarily a new techno-legal and oppositional approach aimed at countering conceptions of proprietary ownership. Richter and McPherson (2012) posit that, beyond open licensing, OER are only valuable if they fit the learner’s context and are fully adaptable. They note several barriers to the adoption of OER beyond technology and licensing, including language, context gaps, lack of cultural diversity, and literacy levels.

While rights-centric/open licence-oriented views of openness are common in OER literature, there exists a broader range of scholarship examining alternative definitions. There is growing recognition of the limited nature of the rights-centric open/closed binary (Rolfe, 2017; Farrow, 2017, 2016). Hodgkinson-Williams and Gray’s (2009) study of OER usage highlights four elements of openness — social, technical, legal, and financial — and constructs each element along a spectrum of openness. Pomerantz and Peek (2016) provide an extensive review of different types of open content demonstrating a range of meanings of openness. Most importantly, their work reveals that there is no single unifying conception of openness across various forms of open content. Similarly, Economides and Perifanou (2018) present a typology of 11 different approaches to openness. Cronin’s (2018) doctoral dissertation contains an extensive discussion of various definitions and approaches to openness. She notes that critical approaches go beyond the simple binary of open and closed and focus on issues of risk, participation, and power. Her dissertation builds on her previous work (Cronin, 2017), where she noted four approaches to openness in education.

Neylon (2017) argues that the common element of open content in academia is a return to traditional values of open knowledge exchange. However, openness has expanded and contracted in waves throughout centuries of university education (Peter and Deimann, 2013), and certain open practices and concepts (e.g., open society, open stacks/shelves, and even open source beer) pre-date both digital technologies and open licensing systems (Smith and Seward, 2017). In this respect, openness could be thought of as a form of educational transparency. Farrow (2017) underscores the reflective and strategic value of contemplating what openness means and the importance of linking openness with critical pedagogy. Bayne, et al. (2015) echo Farrow’s sentiment, while Lambert (2018) posits that the definition of open education must explicitly incorporate and emphasize social justice concepts.
Finally, it is important to note that the indiscriminate use of the term, a practice termed “open washing” (Watters, 2014), may render it conceptually nugatory (Weller, 2014).

Frameworks have been developed for assessing many facets of open education and OER. Frameworks exist for examining the quality of OER (Elias, et al., 2020; Achieve, 2011; Moise, et al., 2014; Kurilovas, et al., 2011), OER adoption (Cox and Trotter, 2017; Abeywardena, 2017), usability facets of open courseware Web sites (Rodríguez, et al., 2017), OER accessibility (Avila, et al., 2016), learning design quality (Stracke, 2019; Avila, et al., 2016), and ethical aspects of OER research (Farrow, 2016). There is even a rubric for evaluating OER quality rubrics (Yuan and Recker, 2015). Although these frameworks provide a means of assessing different aspects of OER, there are fewer frameworks specifically created to examine openness.

One of the earliest frameworks is Hilton, et al.’s (2010) ALMS framework. ALMS consists of four broad dimensions for educators to consider when developing open content: 1) access to the tools needed to edit content; 2) the level of expertise required to edit the content; 3) meaningful editability of the content (i.e., designed with editability in mind); and, 4) whether the content is self-sourced. The ALMS framework underscores the importance of considering downstream users and the relationship between licence and file type. However, ALMS, in its initial expression, lacked a rubric for evaluating openness and instead functioned as a general guideline. Gurell (2012) directly addressed this shortcoming. While access to editing tools was the most challenging element to develop a rubric for, Gurell was successful in developing a scoring mechanism for open content and concluded that the ALMS framework reinforced an earlier conclusion that educational effectiveness and reusability are inversely related (Wiley, 2001; Wiley, et al., 2004). Abeywardena, et al. (2012) addressed the lack of a rubric for the ALMS framework by developing a desirability measure (or D-Index) for quantifying relevance, the level of access, and the level of openness. At present, no additional attempts have been made to further develop the ALMS framework into a concrete rubric for evaluating the openness of OER and OCW.

While the ALMS framework is the most developed openness framework for OER/OCW, several other approaches exist. Ehlers (2011) presents two matrices about OER and open educational practices. Stagg’s (2014) continuum of open practice arranges activities related to open education from least open (basic awareness) to most open (student co-creation of resources). Hodgkinson-Williams and Gray (2009) present four continuums that define openness. The Open Enough framework (McNally and Christiansen, 2019) is the most recent example in this area. Like the original ALMS framework, the Open Enough framework lacks any formal rubric for critically analyzing openness, but it offers more detailed guidance for evaluating openness because it provides eight factors that comprise “openness” (in the broadest sense) and it describes how those factors might be manifested across three dimensions of openness — closed, mixed, and most open (see Figure 2). Although the literature around openness in education is rich, there is little direct application of these frameworks to OCW specifically.
Discussions about how openness might be conceptualized are not always helpful to “on-the-ground” educators who want to use and adapt course content. This paper attempts to fill this literature gap, and builds on McNally and Christiansen’s (2019) previous work, by employing the Open Enough framework to conduct an in-depth content analysis of selected OCW, with the goal of demonstrating which aspects of these courses are open, which aspects are closed, and how this openness/closedness affects educators who wish to adapt content. Using the Open Enough framework as a coding scheme, we analyzed the content of a random selection of OCW. We pose two research questions:

Question 1: Based on the eight factors of openness in the framework, how open are the sampled OCW?

Question 2: Are the sampled OCW adequately designed for educator reuse or adaptation?

Study design

To better understand openness, how OCW can be improved, and where instructor efforts to increase openness should be directed, a comparative multi-case study involving a content analysis of 10 OCW courses was designed (Berg, 2004). The overall research design included three sequential steps: identification of institutions offering current OCW, selection of individual open courses (the cases), and content analysis of the cases using the Open Enough framework as a coding scheme.

Purposive sampling of institutions was chosen, as the aim was to find insights that were generalizable to OCW broadly (Creswell, 1998). The purposive sampling was developed through a preliminary review of OCW offerings listed on the Community College Consortium for OER Web site (n.d.), which lists OCW offerings from several institutions. Delft University of Technology (TU Delft) and MIT were chosen as the sampling databases, as both institutions (despite being science and engineering focused) offer OCW in a variety of disciplines at both the undergraduate and graduate levels. Older courses (courses designed and offered before 2016) were eliminated, as newer offerings often had the benefit of modern enhancements such as updated user interfaces and updated literature.

We compiled a list of the most recent courses from both databases (late 2019 to early 2020) into two spreadsheets. A total of 37 courses from MIT and 212 courses from TU Delft were compiled. However, the 212 TU Delft courses were reduced to 115 once MOOCs (Massive Open Online Courses) were excluded. MOOCs, by definition, are designed for larger enrollments (in the thousands) and less so for educator reuse. To keep comparisons consistent, they were removed; however, we assert that MOOCs should be examined in future research. Using a randomization formula, five courses were selected from each database (Figure 1). The version of each course (date last edited) is indicated in parentheses. The small sample of courses was deliberate; it reflects the time required to conduct an in-depth content analysis of each case, and we contend it provides a sufficient number of replications to separate unique phenomena from general trends (Yin, 2009).
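As a minimal, hypothetical sketch of how such a seeded random draw could be reproduced (the file names, column layout, and seed below are illustrative assumptions, not the study’s actual procedure):

    import csv
    import random

    def sample_courses(csv_path, n=5, seed=2020):
        # Read one course title per row from the first column of a
        # spreadsheet exported to CSV; this layout is an assumption.
        with open(csv_path, newline="", encoding="utf-8") as f:
            courses = [row[0] for row in csv.reader(f) if row]
        # A fixed seed makes the random draw reproducible.
        return random.Random(seed).sample(courses, n)

    # Hypothetical file names for the two compiled spreadsheets.
    mit_sample = sample_courses("mit_courses.csv")
    tu_delft_sample = sample_courses("tu_delft_courses.csv")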
With the cases selected, we then undertook a content analysis of the 10 courses using an a priori coding scheme (Wimmer and Dominick, 2011). The use of an a priori coding approach aligns with Hsieh and Shannon’s (2005) discussion of directed content analysis. A directed content analysis “is to begin coding immediately with the predetermined codes. Data that cannot be coded are identified and analyzed later to determine if they represent a new category or subcategory of an existing code” [2].

A priori coding using the Open Enough framework (Figure 2) was chosen for two reasons. First, coding categories should be mutually exclusive, exhaustive, and reliable, in that the coding process should yield identical or nearly identical results regardless of the coder (Wimmer and Dominick, 2011; Del Balso and Lewis, 2001). The Open Enough framework aligns with these criteria. Second, using the framework as a coding scheme also allows for the testing of the conceptual framework with real world examples. The framework provides eight factors of analysis to consider when developing OCW: copyright/open licensing, accessibility/usability, language, support costs, assessment, digital distribution, file format, and cultural considerations. Each factor’s level of openness is described across three dimensions of openness — closed, mixed, and most open. Definitions for each factor, along with delineations of each factor as closed, mixed, and most open, are detailed in Figure 2. The “closed” category describes the factors when there are no special considerations for openness. The “mixed” category describes resources or courses that are somewhat open but do not maximize openness. The “most open” category describes what maximizing openness would entail for each factor.

We coded each case/course for all eight factors. Each factor was coded based on the framework using the codes “closed,” “mixed,” and “most open” as defined in the Open Enough paper (McNally and Christiansen, 2019). After coding independently, we discussed and mutually agreed upon final determinations. Because of the dual nature of the study — to examine both OCW openness and comment on the applicability of the Open Enough framework — inter-coder reliability is discussed in qualitative terms as opposed to quantified metrics (e.g., Krippendorff’s alpha, Cohen’s kappa, etc.). The identification of new factors resulting from the analysis is further discussed in the section entitled “Refining the Open Enough framework”.
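For readers who wish to operationalize the scheme, the coding matrix reduces to a small mapping from course to per-factor codes. The sketch below is a hypothetical illustration (the course key and code values are placeholders, not the study’s full data):

    # The three a priori codes and eight factors from the Open Enough framework.
    CODES = ("closed", "mixed", "most open")
    FACTORS = (
        "copyright/open licensing", "accessibility/usability", "language",
        "support costs", "assessment", "digital distribution",
        "file format", "cultural considerations",
    )

    # One record per case; the values here are illustrative placeholders.
    coding = {
        "MIT: Biological Chemistry II": {
            "copyright/open licensing": "mixed",
            "file format": "closed",
            "assessment": "most open",
        },
    }

    # Each coder produces one such record per course; disagreements surface
    # as differing values to be reconciled in discussion.
    assert all(code in CODES for factors in coding.values()
               for code in factors.values())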
Results

Summary of findings

Question 1: Based on the eight factors of openness in the framework, how open are the sampled OCW?

The level of openness across the sampled OCW was inconsistent. From a copyright perspective, the OCW were mixed, as both institutions employed a single institutional licence across all content — CC BY-NC-SA. In terms of adherence to accessibility standards, MIT courses were most open while TU Delft’s were closed. We determined accessibility openness by reviewing each OCW platform’s statement regarding which accessibility standards were observed. From a usability perspective, we concluded that TU Delft has the more modern interface, but MIT courses were more navigable. All courses except one were categorized as closed, since course content was offered in a single language. The majority of support costs (course readings) were proprietary, and many of the courses relied heavily on paid books. Two TU Delft courses were the exception to this rule; one course’s readings were completely custom (presumably written by the instructor), and another course made its proprietary ebook freely available (albeit with a closed licence). All of the sampled courses were very open in terms of digital distribution. Each course could be located using several federated OER search engines. Surprisingly, the sampled courses were categorized as closed in terms of file format, as all relied heavily on PDF content. We struggled with the cultural considerations factor, given the large volume of content and the factor’s subjective nature. Courses were split between closed and mixed/most open. Course content dictated this conclusion, as STEM course materials were determined to be more culturally applicable, broadly speaking.

Question 2: Are the sampled OCW adequately designed for educator reuse or adaptation?

Given the limited editability of the course materials, we concluded that the sampled OCW were better suited for learner consumption than educator reuse. The lack of editability was surprising and severely limits these courses’ utility outside their originating institutions. These findings highlight the importance of prioritizing editability and adaptability in OCW, in addition to making them discoverable to the broader community.

Figure 3 provides a visual overview of the coding results from this analysis. Red indicates where courses were closed, blue indicates where courses were mixed (partially open), and green indicates where openness was maximized.

Figure 3: Coded OCW sample.

The following subsections provide a deeper analysis of the sampled OCW. The findings are organized into subsections corresponding to each of the eight factors outlined in the framework, rather than by course, to eliminate redundant discussion, as several factors were determined institutionally rather than at the course level. Examples of openness and closedness are strategically highlighted in each of the subsections.

Copyright/open licensing frameworks

The copyright and open licensing factor is simultaneously paramount and insufficient for achieving openness. Without an open licence, there is no possibility for replication or reuse. Yet, OCW featuring an open licence, but without consideration for the other factors, represent simple open instrumentalism rather than a more pragmatic approach to openness. The Open Enough framework asserts that closed courses would not employ an open licence, leaving all rights reserved with the rights holder (which may be the publisher, not the original author). Mixed courses employ an open licence that features restrictive elements such as the Creative Commons NonCommercial or ShareAlike conditions. The most open courses would employ a licence that allows for maximum shareability through the least restrictive licensing terms, such as CC BY or the public domain waiver.

All of the courses analyzed fell under the category of a mixed resource. Both TU Delft and MIT enforce a common licence across all courses in their respective databases: CC BY-NC-SA (Attribution-NonCommercial-ShareAlike). The NC licence element restricts the use of OCW materials to not-for-profit endeavours. On the surface, this quid pro quo approach to licensing seems reasonable, but it likely holds back the creation of new OER/OCW, as combining materials with conflicting licences is prohibited (Creative Commons Wiki, 2015).

Accessibility/usability formatting

The accessibility/usability formatting factor evaluates OCW on their adherence to established accessibility standards and overall usability. A closed course would not adhere to any accessibility standards, except for those built into the learning management platform where the course was hosted.
A mixed course would include some basic accessibility features, such as closed captions for video or audio transcripts, though this category is not limited to these examples. A completely open course would adhere to established accessibility standards, such as those of the W3C (World Wide Web Consortium) or HHS 508 (U.S. Department of Health and Human Services [HHS] Section 508) in the United States. We chose to evaluate accessibility by each platform’s self-reported adherence to a recognized accessibility standard.

Five of the courses were categorized as closed (all TU Delft), and five were categorized as most open (all MIT). TU Delft had several shortcomings across its courses. Learners could search video lecture transcripts, yet there was no option to download a copy of a transcript file. Textual documents were generally available in PDF format and would presumably work with a screen reader, though the FAQ page did not explicitly say if documents or pages were validated for accessibility. TU Delft offered a help page for its OCW and provided more detailed information about screen reader compatibility, but the institution did not explicitly state if it adhered to an accessibility standard. MIT stated that “OpenCourseWare is committed to accessibility for persons with disabilities and strives to meet W3C Web Content Accessibility Guidelines (WCAG) 2.0, Level AA, including validating HTML, captioning the video, and checking the accessibility of course content as part of the authoring process” (MIT, n.d.a). All video content included closed captions, as well as an ability to download an SRT transcript file.

How well each institution’s Web site actually adhered to common accessibility standards (alt tags for images, for example) was not verified. Several online tools existed for validating HTML accessibility, but errors could be found on most Web sites. To properly examine HTML accessibility, a fuller analysis of the Web site markup and course documents would be required, as sketched below.
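As an indication of what such a markup analysis might involve (a hypothetical sketch, not a tool used in this study; the URL is a placeholder), one common machine-checkable WCAG requirement, images carrying a non-empty alt attribute, can be audited with the Python standard library:

    import urllib.request
    from html.parser import HTMLParser

    class AltTextAuditor(HTMLParser):
        # Collect <img> tags lacking a non-empty alt attribute.
        # (Purely decorative images may legitimately use alt="",
        # so flagged items still require human review.)
        def __init__(self):
            super().__init__()
            self.missing_alt = []

        def handle_starttag(self, tag, attrs):
            if tag == "img":
                attrs = dict(attrs)
                if not attrs.get("alt"):
                    self.missing_alt.append(attrs.get("src", "<no src>"))

    # Placeholder URL; a real audit would crawl every course page and document.
    page = urllib.request.urlopen("https://ocw.example.edu/course").read()
    auditor = AltTextAuditor()
    auditor.feed(page.decode("utf-8", errors="replace"))
    print(len(auditor.missing_alt), "images missing alt text")

Fuller validation (contrast, heading order, form labels) would still require dedicated checkers, which is why self-reported adherence was used in this study instead.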
Usability was considerably more challenging to evaluate. Without a predefined set of evaluation standards, usability was too subjective a measure to reliably label courses as closed, mixed, or most open. Some courses were simply easier to navigate than others. TU Delft’s “Drinking Water Treatment 2” was somewhat confusing to navigate at first, due to changing navigation menus. TU Delft’s user interface is more modern and appears to work slightly better on mobile devices. MIT’s interface is generally more dated from a design perspective, but MIT courses were more straightforward in terms of their navigation, especially when moving back and forth between course sections to access content.

Language

The language factor is a measure of how many languages course materials had been translated into and/or the level of consideration given to guiding translation. Nine of the courses were categorized as closed and one was categorized as mixed. All of the TU Delft and MIT courses were offered in English, with some of the TU Delft materials also being offered in Dutch. Class PowerPoint slides and recorded lectures (including closed captions) were largely provided in English. MIT courses were exclusively offered in English, which is not surprising given that it is an American institution.

Language remains one of the hardest elements of openness to maximize, and one of the hardest to assess. Creating a bilingual course is a monumental undertaking for an individual, so this factor is likely to remain closed without financial support from educational institutions or governments. The challenge in assessment occurs when courses inconsistently offer resources in different languages. For example, TU Delft’s “Drinking Water Treatment 2” included English slides and lecture videos, while some of the final assignments were provided in Dutch. The inconsistency is understandable if some of the example assignments were written in Dutch by students. TU Delft’s “Public Hygiene” course was primarily in English, but included some supplemental materials in Dutch.

Support costs

The support costs factor examines the openness of a course’s required and suggested readings. For example, a closed course would be reliant on a paid textbook. A mixed course might forgo a textbook to save students money, but might still rely on licensed library resources, which have an institutional cost and are likely not open access. The most open course would rely exclusively on openly licensed textbooks, open access readings, or materials in the public domain.

Eight of the courses were categorized as closed, one was categorized as mixed, and one was categorized as most open. All MIT courses were categorized as closed because they relied heavily on paid books, and course reading lists frequently included links to Amazon for purchase. Presumably, many of these resources could be available through the MIT libraries, or a public library, but there were very few open readings that would be accessible to the public. In contrast, TU Delft’s courses offered a better mix of closed and open resources. “Public Hygiene and Epidemiology” (2016) is perhaps the best example of a completely open course, as its readings were custom to the course and openly licensed. In comparison, “The Hydrology of Catchments, Rivers and Deltas” (2016), categorized as mixed, had a proprietary textbook published by Elsevier that was available digitally for free, though a hard copy would have to be purchased. However, like the “Public Hygiene” course, there were many custom course readings as well as citations to several academic journal articles. The journal articles were not openly accessible online, but TU Delft students would presumably have access through a library.

The lack of openness in support costs is somewhat surprising, and the reasons for it may be pedagogical. Many of the courses analyzed did not prioritize open readings. Some variability (two of the 10 courses) arose from situations where a larger variety of educational resources were included with varying degrees of openness. The lack of open readings in OCW highlights an important pedagogical tension in OCW design.

Assessment

Open content for self-directed learning is enhanced when learners can assess their progress. In traditional learning environments, whether face-to-face or online, assessment mechanisms, such as written assignments and tests/exams, are the sine qua non for students to demonstrate their understanding of material. Closed represented courses with no assessment available. Mixed represented courses where assessment mechanisms were available but the learner could not effectively self-assess their performance (e.g., short answer, a long essay, or quantitative assessments with no answers provided).
Most open represented courses where learners could meaningfully self-assess their understanding of the material (e.g., true/false or multiple-choice questions with answer keys/solutions provided); however, it is important to note that maximizing the openness of assessment does not necessarily produce the most pedagogically appropriate avenues for demonstrating learning.

Assessment was categorized as mixed for six of the 10 courses. Generally, course assignments were made available in the form of qualitative assessment mechanisms (e.g., essays), but self-directed learners, if they elected to undertake an assignment, would have no way of assessing their work. For example, the final assignment in MIT’s “Introduction to Art History” course was to write an essay on one of two topics [3]. While most courses that fell under the mixed category relied on essay/paper type forms of assessment, TU Delft’s “Structured Electronic Design” was a notable departure because it assigned quantitative exercises but did not provide solutions. There was some variation in the amount of instruction/guidance provided to students for essay assignments. For example, MIT’s “Equity and Inclusion” course provided six broad bullet points of guidance for writing papers; this included a word count range (500–600 words), leading with a big idea, laying the ground and providing context, making points well, explaining why readers should care about the paper, and providing specific recommendations. In contrast, the MIT “Innovation Systems” course provided an extensive multi-page description of how to write a successful final paper; it included a detailed description of the assignment and topics for students to choose from, in-depth guidelines on paper structure and organization, and additional directions for each paper topic.

Three of the 10 courses were categorized as most open and one was categorized as closed. TU Delft’s “System Validation” and “Drinking Water Treatment 2” courses and MIT’s “Biological Chemistry II” course had assessments that were categorized as most open. For “Biological Chemistry II”, quantitative questions and solutions were provided. Disciplinary differences explain part of this variance, as courses in natural sciences and engineering tended toward more quantitative assessment. The two TU Delft courses were notably different in their approach to openness and assessment. Rather than providing quantitative questions with solutions, the assessments were more qualitative; however, examples of student work were provided. In these cases, self-directed learners could judge themselves against the student work provided, although this approach was more subjective than having objective answers. TU Delft’s “Public Hygiene and Epidemiology” course was categorized as closed because it was the only course that did not include any assessment materials. Beyond a breakdown of how assignments were weighted in the final grade, self-directed learners would have no means of attempting to assess their learning.

Digital distribution

The digital distribution factor categorizes courses based on their discoverability. Closed courses are completely inaccessible to the public; this would include face-to-face and online courses locked behind learning management systems such as Blackboard or Moodle. Mixed courses are open to the world, but their discoverability is low; for example, OCW could be hosted in a university institutional repository that is not linked to a federated search engine.
The most open courses are those with high discoverability. These open courses should be available through one or more of the major OER federated searches, such as Merlot, OER Commons, OASIS, edX, George Mason University’s Metafinder, etc.

All 10 of the courses analyzed were easily discoverable through at least one of the aforementioned search engines. This was a surprise, since discoverability is so commonly cited as a barrier to OCW adoption by educators (Cortinovis, et al., 2019). Many platforms and institutions advertised their courses as being available through multiple OER databases. Open Yale advertises its courses as being available through Coursera, and it has its own Coursera home page. TU Delft advertises that its MOOCs are available through edX and that it is a member of the Open Education Consortium. MIT’s Open Learning portal links users to all its various open courses and displays which are available through edX.

File format

Like licensing, file format is a critical factor that determines the practical openness of any given OCW. Without the ability to revise and remix content, OCW loses all utility, assuming utility is measured using a rights-centric (or technical) framework such as Wiley’s 5Rs (Wiley, n.d.). Closed courses include non-editable materials provided in proprietary file formats. Ideally, OCW would at least fall under mixed, meaning that documents and materials would be editable even if only offered in a widely used proprietary format (such as Microsoft Word/.docx). The most open courses would feature fully editable materials in an open source file type.

All of the courses analyzed were categorized as closed. Text documents (notes, readings, slides, and bibliographies) were almost exclusively provided in non-editable PDF format. While choosing PDF over Microsoft Word (.docx) or OpenDocument Format (.odt) (either of which would likely have been the formats these materials were created in) does not necessarily undermine the learner’s experience, it cripples the course’s appeal from the perspective of educators who wish to adapt the course materials. The decision to offer course content in one format was also puzzling because uploading additional file formats requires minimal effort or skill (McNally and Christiansen, 2019); a sketch of such a conversion follows this subsection. It is possible that the lack of editable file formats is a result of institutional policy. Unlike many factors that are closed by default (e.g., All Rights Reserved for copyright), closing OCW documents requires an active choice to convert materials into non-editable formats. For example, MIT’s “Public Transportation” course has a thorough set of lecture notes, but the use of PDF-only documents makes these resources difficult to revise and adapt.

Video and audio files continue to be a sticking point in file format openness. All multimedia materials were provided to users in MP3 and MP4 format. These are open file types, but they are less straightforward to edit without expertise in either proprietary or open editing software. What counts as a more or less editable video/audio file continues to be an area of contention when assessing openness.
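To illustrate how little effort the extra step involves (a hypothetical sketch: the directory layout is invented, and it assumes the LibreOffice command-line tool is installed), an entire folder of editable source documents can be batch-exported to PDF while the editable originals are published alongside:

    import subprocess
    from pathlib import Path

    # Export each editable .odt source to PDF, keeping the editable
    # original next to it so both versions can be uploaded to the platform.
    # Assumes the "libreoffice" CLI is on the PATH; paths are illustrative.
    source_dir = Path("course_materials")
    for doc in source_dir.glob("*.odt"):
        subprocess.run(
            ["libreoffice", "--headless", "--convert-to", "pdf",
             "--outdir", str(source_dir), str(doc)],
            check=True,
        )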
Cultural considerations

The cultural considerations factor was arguably the most difficult to address. Of the 10 courses, we categorized five as most open, two as mixed, and three as closed. However, while it was easy to assess how many languages were offered, cultural considerations can be subjective. For example, TU Delft’s “Public Hygiene” course (categorized as mixed) made note that Biblical classifications of life on Earth (plants, trees, animals, and finally humans) had influenced early approaches to the classification of life. It was unclear whether this constituted some sort of cultural knowledge that would create a barrier to learners, and thus whether it should be categorized as closed, mixed, or even most open. Similarly, MIT’s “Public Transportation” course (categorized as mixed) made reference to transportation systems in Boston (the metropolis closest to Cambridge), but also to other cities such as Vancouver. Certainly, these Western references did not count as “devoid of culturally specific material”. In the case of MIT’s “Art History” course (categorized as closed), the course description noted that it specifically focused on Euro-American traditions, raising the question of whether a course focusing on a specific cultural area is, by definition, closed. A fundamental problem with the cultural considerations factor was that, upon analysis, it appeared impossible to have a course dealing with a cultural element that was at the same time devoid of culturally specific material. Some of the courses, such as those offered by TU Delft, could be considered most open under the framework because the material is technical and largely devoid of jargon and cultural references. The same could be said for the “Biological Chemistry II” and “Innovation Systems for Science” courses from MIT.

Study limitations

This exploratory pilot study is not without limitations. Only two institutional OCW catalogues were examined (though several others were purposively eliminated), and only 10 courses in total were analyzed. While the sample is quite small, the intent was not to generalize about OCW but to examine and improve the framework. Furthermore, the selection of courses, while random, still had a greater number of courses from the natural sciences and engineering than from the humanities and social sciences or professional fields such as law and medicine.

In some cases, factors were determined by institutional policy rather than by individual course creators. For both the digital distribution and copyright/open licensing factors, the scores did not vary, a direct result of the sampling. Each institution applied the same Creative Commons licence to all of its OCW. For digital distribution, all courses were determined to be “most open” because both TU Delft and MIT make their courses easily discoverable through several OER portals. A broader set of licences would have made for a more interesting and nuanced analysis.

If multiple licences were present within an open course, which category would be associated with the course? In such an event, how would OCW be ranked on a continuum of closed, mixed, and most open? This illustrates that the categories could be further divided. A half-way point between “mixed” and “most open” could help remedy this problem. However, if openness is measured on a continuum, the categories could be infinite. It is possible that this limitation could be remedied by “grading” each discrete resource within an open course numerically, though doing so would not necessarily remove educator bias. An inherent limitation of rubrics and evaluation tools is their implied precision; a number that represents a course’s “Open Score” is satisfying, but flawed.
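To make that flaw concrete, consider a naive numeric grader (entirely hypothetical; the 0/1/2 scale, the per-resource labels, and the normalization are invented for illustration). Every number it produces inherits the coder’s subjective category assignments:

    # Map the three categories onto an ordinal scale and average across
    # the discrete resources coded for one factor, normalized to 0-1.
    SCALE = {"closed": 0, "mixed": 1, "most open": 2}

    def open_score(resource_codes):
        if not resource_codes:
            raise ValueError("no resources were coded")
        return sum(SCALE[c] for c in resource_codes) / (2 * len(resource_codes))

    # A course whose readings mix paid and open materials:
    print(open_score(["closed", "closed", "mixed", "most open"]))  # 0.375

The score looks precise, but the inputs remain judgment calls, which is exactly the limitation noted above.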
Discussion

This analysis yielded insights into how to conceptualize and practically evaluate openness, as well as how openness/closedness affects educators who might wish to adapt such resources. Analyzing OCW is challenging due to the volume and variety of material, which is why this study was limited to 10 OCW cases. Across the sampled OCW, we noticed a certain lack of pragmatic implementation. To make OCW more practical for educator reuse, they should be developed with both technical and pedagogical considerations in mind. Educators and platform directors need to ask “how will another educator use these materials?” Pedagogical elements can easily be overlooked, but they represent a dimension that — if not adequately addressed — can undermine the educational impact of OCW.

Critically evaluating openness: What does it tell us about OCW?

This analysis underscores that the degree of openness of content must be considered by creators before design (Hilton, et al., 2010). While the ALMS framework (Hilton, et al., 2010) identified four technological considerations in this regard, the Open Enough framework further informs creators on how to consider openness, particularly in terms of the workload associated with the development of OER/OCW. As with Gurell (2012), empirically testing frameworks for evaluating openness is a necessary step in their development and refinement. While Gurell aimed at identifying technological barriers to reuse, the Open Enough framework can help creators pragmatically increase openness during the creation process and better anticipate where to devote their time. This approach should benefit downstream reuse. If creators of OER/OCW are aware of the factors that require the least effort to open at the outset, they can more easily maximize openness in these areas without meaningfully increasing the amount of time spent on creating open content.

Overall, it would appear that the variability in outcomes between factors is driven primarily by subject matter. There was higher variability in openness for the cultural considerations and assessment factors than for support costs and file format, for example; subject matter drives the cultural considerations and assessment differences. More importantly, this would suggest that, from an institutional or funder perspective, it is more pragmatic to emphasize openness in STEM, where the opportunity costs of increasing openness are lowest.

The examination of assessment approaches across the 10 courses raises some considerations when developing OCW. With nine of the 10 courses coded as mixed or most open, assessment appears to be an element of openness that is generally easy to achieve or that has been more strictly mandated by MIT and TU Delft. The availability of assessments is unsurprising given that several quality frameworks identify it as an aspect of quality (Moise, et al., 2014; Elias, et al., 2020; Achieve, 2011). It was notable that two of the three most open courses achieved this not through the inclusion of both quantitative questions and answers but through posting sample student work. While this approach increases openness, it could have been further enhanced by providing marked copies of assignments with instructor feedback (particularly any comments that would have identified shortcomings in the assignment). In general, assessment tended toward a greater degree of openness than some of the other factors; this also reflects the uneven number of natural science and engineering courses included in this analysis.
While courses in these fields more easily lend themselves to meaningful open assessment, the inclusion of more qualitative open assessments was notable in that it expanded how meaningful assessments may be designed.

The lack of editable file formats across the sampled courses is such an oversight that it prompts one to ask whether the OCW are more suitable for student consumption than educator reuse. This problem is particularly notable given the heavy recognition in the literature of the limitations of closed file formats (Hilton, et al., 2010; Ovadia, 2019; DeVries, 2013; OECD, 2007). Mandated openness could be to blame. Individual educators who choose to make their courses available through a repository like OER Commons or Merlot might be less likely to pursue openness merely for reasons of personal reputation and, therefore, more likely to upload their course materials in editable formats. From a technical perspective, implementing editable file formats should take first priority for OCW repositories; without them, these courses are not functionally open.

Regardless of how open a course’s content is, digital distribution is critical to OCW reuse. A lack of discoverability, and a lack of awareness of OER databases among faculty, have been consistently identified as barriers (Cortinovis, et al., 2019). If learners or educators cannot locate OCW, or if courses are buried within institutional Web sites, it is unlikely they will be adopted. Content reuse and digital distribution are inherently linked. It is likely that educators who make the effort to open up their courses do so with the hope that their work will be adopted or adapted. While the discoverability of the analyzed courses is expected given the status of their respective institutions, it is encouraging to see high discoverability of OCW. This bodes well for OCW and speaks to the advancements in, and the success of, federated searches that link various repositories and university materials together.

Refining the Open Enough framework

Overall, the Open Enough framework provides a reasonable means for assessing openness, though additional refinements to the framework are necessary. For reasons of clarity, the “Digital Distribution” factor should be renamed “Discoverability,” and “Support Costs” should be renamed “Materials.” The “Cultural Considerations” and “Usability” factors should be removed due to their level of subjectivity. In general, science courses involved fewer culturally specific references; by contrast, the MIT “Art History” course was very culturally specific by nature. While we are confident in these broad observations, removing “Cultural Considerations” would reduce the level of subjectivity. We recommend that “Usability” be removed from the “Accessibility/Usability” factor in favour of just “Accessibility”, since usability, like cultural considerations, is somewhat subjective and there already exist many metrics for assessing the usability of Web content. Accessibility standards are easier to evaluate objectively, as they do not take into account subjective design elements.

The Open Enough framework should not, however, simply be about saving creators’ time for OER/OCW construction or attempting to enable greater reuse. It is designed to encourage educators to consider openness more broadly, including the consequences of maximizing openness. It should push educators and learners to think about how we learn in addition to what we learn, and to consider the context of educational resources (Knox, 2013).
In this regard, removing “Cultural Considerations” and “Usability” as openness factors, but maintaining them as additional considerations, is advised. “Localization,” which includes both translation into local languages and adjustment for cultural contexts, still presents significant and time-intensive barriers to realizing the supposed openness of OER (Amiel, 2013; Wolfenden and Adinolfi, 2019). Creators of open content must consider not only openly licensing materials produced for a primary audience, but also a broader, secondary audience of potential adopters and learners (DeVries, 2013). Richter and McPherson (2012) have previously suggested that a contextual description be linked to OER, which would provide reports from educators on successes and failures; while such reports may be time-intensive to create, and for adopters to review, there is a need for supplementary documents to accompany OER/OCW to facilitate reuse. Figure 4 illustrates the revised Open Enough framework.

Figure 4: Revised Open Enough framework.

Missing factors for assessing openness: Content volume and “harvestability”

There is room for other factors when evaluating OCW openness. There is an argument for considering the ease with which users can download an entire open course. This “harvestability” is an advantage that MIT has over TU Delft, one that would have made this analysis lopsided had it been included as a factor. The ability to download a ZIP file of course content for off-line use is potentially useful for users with limited Internet bandwidth — particularly for educators in rural areas or the nearly three billion people who have never used the Internet (ITU, 2021). Even MIT’s implementation is limited, as ZIP files do not contain course video lectures; those need to be downloaded separately. While harvestability is measurable, it is binary in nature — either a course is harvestable or it is not. As such, harvestability is noted as the third “other consideration” and not raised to the level of a full factor (Figure 4).

The volume of content in an open course also introduces important considerations. While it is not a factor of openness, volume is tied to openness; a course with a considerable volume of content is harder to make open. Volume is also tied to quality and the ability of users to pursue self-directed learning. MIT’s “Art History” course is somewhat open, but the lack of content effectively renders the course unusable. Of the 10 courses studied, MIT’s “Innovation” course was the most educationally compelling due to the depth and quality of its content; however, it scored low in openness. This suggests Wiley’s (2001) nearly two-decade-old reusability paradox still holds.

Two domains of openness

While this study certainly does not aim to generalize from the analysis of this small subset of OCW, it highlights an important delineation between open educational philosophies. The rights-centric approach, as described by Wiley’s 5Rs (Wiley, n.d.), appears to exist in stark contrast to the (often) invisible pedagogical considerations of open education. The rights-centric approach — or “Technical Factors,” as outlined by the revised framework (Figure 4) — has dominated much of the early discussion in open education literature. This is not surprising. During the early years of the open education movement, it could be argued that getting educators to “go public” with their work took priority. Sharing materials on open platforms, in whatever form, was prioritized over not sharing.
Getting educators comfortable with Internet technologies from 2001 to the present, as well as with open licensing models, was likely necessary for the open education movement to expand. However, as educators have become more skilled with such technologies, focus has shifted toward pedagogical factors that give OER and OCW context and broader utility, and that are also conceptually more complex and contested (Tietjen and Asino, 2021). In this respect, the divide between technical and pedagogical factors comes down to educator challenges. Addressing technical factors requires that educators be willing to share their work publicly and possess the skill/knowledge to do so. When considering OER and OCW holistically, pedagogical factors are likely to require the majority of the development time, in order for educators to develop the necessary ancillary materials and provide guidance to those who wish to adopt or adapt their resources (Figure 4). Open education needs to continue to place an emphasis on these pedagogical considerations.

Technical and pedagogical factors are reconciled in different ways. It is puzzling that OCW do not necessarily achieve maximum openness among technical factors, given that these factors are often low-hanging fruit. Any of the courses analyzed could have been made considerably more open by including a less restrictive licence. Files could have been made available in multiple editable formats. Given that the files were likely created in programs such as Microsoft Word or OpenOffice, the choice of PDF format represents an active choice of closure.

Addressing the pedagogical factors is not as straightforward. Aside from being bilingual, how does one meaningfully address language? How does one address cultural considerations? A glossary of all technical terms would be welcomed by any educator hoping to adopt an existing course. Instructions for educators, such as colour coding or ranking course modules on their broader applicability, or highlighting where cultural or geographic examples could be swapped out, would all go toward making a course more “open.” Assessment, though generally well addressed in the sampled courses, could include more example student work with instructor comments. Greater accessibility can be achieved, but it requires knowledge of standards such as image alt tags, properly formatted text for screen readers, transcripts for audio, and closed captions for video material, as well as some understanding of usability and design. In this respect, accessibility requires pedagogical knowledge of course structure.

Finally, since the period of data collection for this study, MIT has introduced a new set of pedagogical resources to support its OCW — “OCW Educator: Instructor Insights” (MIT, n.d.b). These pages, which complement the curricular materials for open courses, provide instructor perspectives on a range of pedagogical issues such as course design, assessment practices, and even how student time is spent in the in-person offering of the class. These pages allow other open educators to move from the tip of the iceberg to view what lies beneath the surface in the development and delivery of various courses, and thus represent an important improvement, holistically addressing open pedagogy and OCW.

Conclusion

There are several unanswered questions that would be well suited for follow-up research. Future research could take the Open Enough OCW framework and apply it to MOOCs.
Alternatively, comparing a sample of MOOCs and OCW would reveal what, if any, differences exist in the level of openness and design. Also, this study could easily be expanded to include a larger sample of OCW from a more varied subset of institutions. This would provide the open education community with a valuable snapshot of the state of openness in OCW.

We came to two conclusions based on the research questions posed in this study. First, the level of openness across the sampled OCW was inconsistent, at least across the eight factors outlined in the framework. Courses were open in terms of assessment and digital distribution but closed in terms of language, support costs, and file format. Copyright, accessibility, and cultural considerations were more mixed in terms of openness. Second, there was a general lack of editability that made these OCW more suitable for student consumption than educator reuse. Ultimately, the results highlight the importance of considering educator needs in OCW. A rights-centric approach to openness (i.e., open licensing), while an essential component of openness, is insufficient when standing alone.

About the authors

Erik G. Christiansen is an assistant professor/librarian at Mount Royal University in Calgary, Alberta. His research interests include Web usability, open education, scaffolded information literacy instruction, and information seeking. Erik holds a Bachelor of Arts in international relations from the University of British Columbia Okanagan and a Master of Library and Information Studies from the University of Alberta.
E-mail: echristiansen [at] mtroyal [dot] ca

Michael B. McNally is an associate professor in the Faculty of Education at the University of Alberta. His research interests include intellectual property and its alternatives, open educational resources, rural broadband policy, and digital literacy. He has a Ph.D. and MLIS from the University of Western Ontario.
E-mail: mmcnally [at] ualberta [dot] ca

Notes

1. UNESCO, n.d., pp. 2–3.

2. Hsieh and Shannon, 2005, p. 1,282.

3. In this assignment, students address one of the following two topics: 1. What was the promise of the new technologies of photography and stereoscopy, and what were some of their problems, according to nineteenth-century commentators Charles Baudelaire and Oliver Wendell Holmes? What, in their view, is the relationship between these new media and art, and what impact do they predict these new technologies will have on art and its audiences? 2. How did women Impressionist artists negotiate the condition of modernity and its spaces of femininity in their art? To answer these questions, students write a critical analysis of Griselda Pollock’s account of modernity and the spaces of femininity, and test her arguments by applying them to selected works of art.

References

Ishan S. Abeywardena, 2017. “An empirical framework for mainstreaming OER in an academic institution,” Asian Association of Open Universities Journal, volume 12, number 2, pp. 230–242.
doi: https://doi.org/10.1108/AAOUJ-11-2017-0036, accessed 8 December 2020.

Achieve, 2011. “Rubrics for evaluating open educational resource (OER) objects,” at https://www.achieve.org/files/AchieveOERRubrics.pdf, accessed 6 December 2020.

Tel Amiel, 2013. “Identifying barriers to the remix of translated open educational resources,” International Review of Research in Open and Distributed Learning, volume 14, number 1, pp. 126–144.
doi: https://doi.org/10.19173/irrodl.v14i1.1351, accessed 5 December 2020.
Cecilia Avila, Silvia Baldiris, Ramon Fabregat, and Sabine Graf, 2016. “Cocreation and evaluation of inclusive and accessible open educational resources: A mapping toward the IMS Caliper,” IEEE Revista Iberoamericana de Tecnologias del Aprendizaje, volume 11, number 3, pp. 167–176. doi: https://doi.org/10.1109/RITA.2016.2589578, accessed 8 November 2021.

Frederick W. Baker, III, 2017. “An alternative approach: Openness in education over the last 100 years,” TechTrends, volume 61, pp. 130–140. doi: https://doi.org/10.1007/s11528-016-0095-7, accessed 12 December 2020.

Siân Bayne, Jeremy Knox, and Jen Ross, 2015. “Open education: The need for a critical approach,” Learning, Media and Technology, volume 40, number 3, pp. 247–250. doi: https://doi.org/10.1080/17439884.2015.1065272, accessed 13 December 2020.

Bruce L. Berg, 2004. Qualitative research methods for the social sciences. Fifth edition. Boston, Mass.: Pearson.

Community College Consortium for OER, n.d. “Open CourseWare,” at https://www.cccoer.org/learn/find-oer/open-courseware/, accessed 9 March 2021.

Renato Cortinovis, Alexander Mikroyannidis, John Domingue, Paul Mulholland, and Robert Farrow, 2019. “Supporting the discoverability of open educational resources,” Education and Information Technologies, volume 24, pp. 3,129–3,161. doi: https://doi.org/10.1007/s10639-019-09921-3, accessed 8 December 2021.

Glenda Cox and Henry Trotter, 2017. “An OER framework, heuristic and lens: Tools for understanding lecturers’ adoption of OER,” Open Praxis, volume 9, number 2, pp. 151–171. doi: https://doi.org/10.5944/openpraxis.9.2.571, accessed 8 December 2021.

Creative Commons Wiki, 2015. “ShareAlike compatibility,” at https://wiki.creativecommons.org/wiki/ShareAlike_compatibility, accessed 9 March 2021.

Catherine Cronin, 2018. “Openness and praxis: A situated study of academic staff meaning-making and decision-making with respect to openness and use of open educational practices in higher education,” Ph.D. dissertation, National University of Ireland, Galway, at https://aran.library.nuigalway.ie/handle/10379/7276, accessed 9 March 2021.

Catherine Cronin, 2017. “Openness and praxis: Exploring the uses of open educational practices in higher education,” International Review of Research in Open and Distributed Learning, volume 18, number 5. doi: https://doi.org/10.19173/irrodl.v18i5.3096, accessed 7 April 2021.

Michael Del Balso and Alan D. Lewis, 2001. First steps: A guide to social research. Second edition. Toronto: Nelson Thomson Learning.

Irwin DeVries, 2013. “Evaluating open educational resources: Lessons learned,” Procedia — Social and Behavioral Sciences, volume 83, pp. 56–60. doi: https://doi.org/10.1016/j.sbspro.2013.06.012, accessed 8 November 2020.

Anastasios A. Economides and Maria A. Perifanou, 2018. “The many faces of openness in education,” Proceedings of the International Conference on Education and New Learning Technologies. doi: https://doi.org/10.21125/edulearn.2018.0943, accessed 9 March 2021.

Ulf-Daniel Ehlers, 2011. “Extending the territory: From open educational resources to open educational practice,” Journal of Open, Flexible, and Distance Learning, volume 15, number 2, at https://www.jofdl.nz/index.php/JOFDL/article/view/64, accessed 13 November 2020.

Mirette Elias, Allard Oelen, Mohammadreza Tavakoli, Gábor Kismihok, and Sören Auer, 2020.
“Quality evaluation of open educational resources,” In: Carlos Alario-Hoyos, María Jesús Rodríguez-Triana, Maren Scheffel, Inmaculada Arnedillo-Sánchez, and Sebastian Maximilian Dennerlein (editors). Addressing global challenges and quality education. Lecture Notes in Computer Science, volume 12315. Cham, Switzerland: Springer, pp. 410–415. doi: https://doi.org/10.1007/978-3-030-57717-9_36, accessed 8 January 2021.

Robert Farrow, 2017. “Open education and critical pedagogy,” Learning, Media and Technology, volume 42, number 2, pp. 130–146. doi: https://doi.org/10.1080/17439884.2016.1113991, accessed 6 November 2020.

Robert Farrow, 2016. “A framework for the ethics of open education,” Open Praxis, volume 8, number 2, pp. 93–109. doi: https://dx.doi.org/10.5944/openpraxis.8.2.291, accessed 4 November 2020.

Seth Michael Gurell, 2012. “Measuring the technical difficulty in reusing open educational resources with the ALMS analysis framework,” Ph.D. dissertation, Brigham Young University, at https://scholarsarchive.byu.edu/etd/3472/, accessed 22 March 2021.

John Hilton III, David Wiley, Jared Stein, and Aaron Johnson, 2010. “The four ‘R’s of openness and the ALMS analysis: Frameworks for open educational resources,” Open Learning, volume 25, number 1, pp. 37–44. doi: https://doi.org/10.1080/02680510903482132, accessed 19 October 2020.

Cheryl Hodgkinson-Williams and Eve Gray, 2009. “Degrees of openness: The emergence of open educational resources at the University of Cape Town,” International Journal of Education and Development using Information and Communication Technology, volume 5, number 5, pp. 101–116, and at https://open.uct.ac.za/handle/11427/8860, accessed 3 November 2020.

Hsiu-Fang Hsieh and Sarah E. Shannon, 2005. “Three approaches to qualitative content analysis,” Qualitative Health Research, volume 15, number 9, pp. 1,277–1,288. doi: https://doi.org/10.1177/1049732305276687, accessed 2 February 2022.

International Telecommunication Union (ITU), 2021. “2.9 billion people still offline” (30 November), at https://www.itu.int/en/mediacentre/Pages/PR-2021-11-29-FactsFigures.aspx, accessed 8 December 2021.

Jeremy Knox, 2013. “The limitations of access alone: Moving toward open processes in education technology,” Open Praxis, volume 5, number 1, pp. 21–29. doi: https://dx.doi.org/10.5944/openpraxis.5.1.36, accessed 5 November 2020.

Eugenijus Kurilovas, Virginija Bireniene, and Silvija Serikoviene, 2011. “Methodology for evaluating quality and reusability of learning objects,” Electronic Journal of e-Learning, volume 9, number 1, pp. 39–51, at https://www.academic-publishing.org/index.php/ejel/article/view/1604/1567, accessed 5 November 2020.

Sarah R. Lambert, 2018. “Changing our (dis)course: A distinctive social justice aligned definition of open education,” Journal of Learning for Development, volume 5, number 3, pp. 225–244. doi: https://doi.org/10.56059/jl4d.v5i3.290, accessed 9 March 2021.

Elliot Maxwell, 2006. “Open standards, open source and open innovation: Harnessing the benefits of openness,” Innovations: Technology, Governance, Globalization, volume 1, number 3, pp. 119–176. doi: https://doi.org/10.1162/itgg.2006.1.3.119, accessed 12 September 2022.

Michael B. McNally and Erik G. Christiansen, 2019. “Open enough? Eight factors to consider when transitioning from closed to open resources and courses: A conceptual framework,” First Monday, volume 24, number 6, at https://firstmonday.org/article/view/9180/7808, accessed 3 June 2019.
doi: https://doi.org/10.5210/fm.v24i6.9180, accessed 3 June 2019.

MIT, n.d.-a. “FAQ: Technology,” at https://mitocw.ups.edu.ec/help/faq-technology/, accessed 9 March 2021.

MIT, n.d.-b. “OCW Educator: Instructor Insights,” at https://ocw.mit.edu/courses/instructor-insights/, accessed 9 March 2021.

Gabriela Moise, Monica Vladoiu, and Zoran Constantinescu, 2014. “MASECO: A multi-agent system for evaluation and classification of OERs and OCW based on quality criteria,” In: Mirjana Ivanović and Lakhmi C. Jain (editors). E-learning paradigms and applications: Agent-based approach. Studies in Computational Intelligence, volume 528. Berlin: Springer. doi: https://doi.org/10.1007/978-3-642-41965-2_7, accessed 30 October 2020.

Cameron Neylon, 2017. “Openness in scholarship: A return to core values?” In: Leslie Chan and Fernando Loizides (editors). Expanding perspectives on open science: Communities, culture and diversity. Amsterdam: IOS Press, pp. 6–17. doi: https://doi.org/10.3233/978-1-61499-769-6-6, accessed 12 September 2022.

Nel Noddings and D. Scott Enright, 1983. “The promise of open education,” Theory into Practice, volume 22, number 3, pp. 182–189. doi: https://doi.org/10.1080/00405848309543059, accessed 12 September 2022.

Open Education Consortium, n.d. “What is open courseware?” at https://www.oeconsortium.org/faq/what-is-open-courseware/, accessed 8 December 2021.

Organisation for Economic Co-operation and Development (OECD), 2007. “Giving knowledge for free: The emergence of open educational resources,” at https://www.oecd.org/education/ceri/38654317.pdf, accessed 7 December 2020.

Steven Ovadia, 2019. “Addressing the technical challenges of open educational resources,” portal: Libraries and the Academy, volume 19, number 1, pp. 79–93. doi: https://doi.org/10.1353/pla.2019.0005, accessed 13 September 2022.

Eleonora Pantò and Anna Comas-Quinn, 2013. “The challenge of open education,” Journal of e-Learning and Knowledge Society, volume 9, number 1, pp. 11–22. doi: https://doi.org/10.20368/1971-8829/798, accessed 8 December 2020.

Sandra Peter and Markus Deimann, 2013. “On the role of openness in education: A historical reconstruction,” Open Praxis, volume 5, number 1, pp. 7–14. doi: http://dx.doi.org/10.5944/openpraxis.5.1.23, accessed 24 February 2020.

Jeffrey Pomerantz and Robin Peek, 2016. “Fifty shades of open,” First Monday, volume 21, number 5, at https://firstmonday.org/article/view/6360/5460, accessed 17 February 2020. doi: https://doi.org/10.5210/fm.v21i5.6360, accessed 12 September 2022.

Thomas Richter and Maggie McPherson, 2012. “Open educational resources: Education for the world?” Distance Education, volume 33, number 2, pp. 201–219. doi: https://doi.org/10.1080/01587919.2012.692068, accessed 17 February 2020.

Germania Rodríguez, Jennifer Pérez, Samantha Cueva, and Rommel Torres, 2017. “A framework for improving Web accessibility and usability of Open Course Ware sites,” Computers and Education, volume 109, pp. 197–215. doi: https://doi.org/10.1016/j.compedu.2017.02.013, accessed 17 May 2020.

Vivien Rolfe, 2017. “Striving toward openness: But what do we really mean?” International Review of Research in Open and Distributed Learning, volume 18, number 7, pp. 75–88. doi: https://doi.org/10.19173/irrodl.v18i7.3207, accessed 17 March 2020.

Matthew Longshore Smith and Ruhiya Seward, 2017. “Openness as social praxis,” First Monday, volume 22, number 4, at https://firstmonday.org/article/view/7073/6087, accessed 8 March 2020.
doi: https://doi.org/10.5210/fm.v22i4.7073, accessed 12 September 2022.

Adrian Stagg, 2014. “OER adoption: A continuum for practice,” International Journal of Educational Technology in Higher Education, volume 11, number 3, pp. 151–165. doi: https://doi.org/10.7238/rusc.v11i3.2102, accessed 23 March 2020.

Christian M. Stracke, 2019. “Quality frameworks and learning design for open education,” International Review of Research in Open and Distributed Learning, volume 20, number 2, pp. 180–203. doi: https://doi.org/10.19173/irrodl.v20i2.4213, accessed 22 February 2020.

Phil Tietjen and Tutaleni I. Asino, 2021. “What is open pedagogy? Identifying commonalities,” International Review of Research in Open and Distributed Learning, volume 22, number 2, pp. 186–204. doi: https://doi.org/10.19173/irrodl.v22i2.5161, accessed 2 December 2021.

Nathaniel Tkacz, 2012. “From open source to open government: A critique of open politics,” Ephemera: Theory & Politics in Organization, volume 12, number 4, pp. 386–405, and at http://www.ephemerajournal.org/contribution/open-source-open-government-critique-open-politics-0, accessed 9 March 2021.

UNESCO, 2019. “Draft recommendation on open educational resources” (8 October), at https://unesdoc.unesco.org/ark:/48223/pf0000370936, accessed 9 March 2021.

UNESCO, n.d. “Open educational resources,” at https://en.unesco.org/themes/building-knowledge-societies/oer, accessed 8 December 2021.

Audrey Watters, 2014. “From ‘open’ to justice” (16 November), at http://hackeducation.com/2014/11/16/from-open-to-justice, accessed 23 March 2021.

Martin Weller, 2014. The battle for open: How openness won and why it doesn’t feel like victory. London: Ubiquity Press. doi: https://doi.org/10.5334/bam, accessed 1 April 2020.

David Wiley, n.d. “Defining the ‘open’ in open content and open educational resources,” at http://opencontent.org/definition/, accessed 23 February 2020.

David Wiley, 2001. “The reusability paradox,” at https://opencontent.org/docs/paradox.html, accessed 23 February 2020.

David Wiley, T.J. Bliss, and Mary McEwan, 2014. “Open educational resources: A review of the literature,” In: J. Michael Spector, M. David Merrill, Jan Elen, and M.J. Bishop (editors). Handbook of research on educational communications and technology. New York: Springer, pp. 781–789. doi: https://doi.org/10.1007/978-1-4614-3185-5_63, accessed 2 March 2020.

David Wiley, Sandie Waters, Deonne Dawson, Brent Lambert, Matthew Barclay, and David Wade, 2004. “Overcoming the limitations of learning objects,” Journal of Educational Multimedia and Hypermedia, volume 13, number 4, pp. 507–521, and at https://www.learntechlib.org/primary/p/6586/, accessed 12 September 2022.

Roger D. Wimmer and Joseph R. Dominick, 2011. Mass media research: An introduction. Ninth edition. Boston, Mass.: Cengage-Wadsworth.

Freda Wolfenden and Lina Adinolfi, 2019. “An exploration of agency in the localisation of open educational resources for teacher development,” Learning, Media and Technology, volume 44, number 3, pp. 327–344. doi: https://doi.org/10.1080/17439884.2019.1628046, accessed 8 November 2020.

Robert K. Yin, 2009. Case study research: Design and methods. Fourth edition. Los Angeles, Calif.: Sage.

Min Yuan and Mimi Recker, 2015. “Not all rubrics are equal: A review of rubrics for evaluating the quality of open educational resources,” International Review of Research in Open and Distributed Learning, volume 16, number 5, pp. 16–38. doi: https://doi.org/10.19173/irrodl.v16i5.2389, accessed 6 November 2020.
Editorial history

Received 17 May 2021; revised 23 February 2022; accepted 13 September 2022.

This paper is licensed under a Creative Commons Attribution 4.0 International License.

Examining the technological and pedagogical elements of select open courseware by Erik G. Christiansen and Michael B. McNally.
First Monday, volume 27, number 10 (3 October 2022).
https://firstmonday.org/ojs/index.php/fm/article/download/11639/10712
doi: https://dx.doi.org/10.5210/fm.v27i10.11639