To: John Nixon, VP, Instructional Services
    Virginia Burley, Dean, Instructional Services
From: Jemma Blake-Judd, Coordinator, SLOs/AUOs Implementation Team
Subject: Final Summary/Evaluation of SLOs/AUOs Team Year-Three Implementation Activities
Date: June 10, 2007

The SLOs/AUOs Implementation Team has focused on four areas in the final year of the implementation process:

1. Campus-wide participation
2. Effective communication and documentation
3. Quality of the SLOs/AUOs efforts
4. Team support of the Planning for Institutional Effectiveness Process (PIE) and the conversion to Trac Dat

These areas have informed the Coordinator's reports to the campus and will serve as the framework for this final summary/evaluation of the team's work. Because this is the final year, a section entitled "Future Considerations" has been added.

Area One: Campus-Wide Participation

Summary of Team Activities: When the final year of SLOs/AUOs implementation began in fall of 2006, 60% of the college's 158 departments/disciplines/units were working on outcomes/objectives. The SLOs/AUOs team continued to face the same challenges it had met during the first two years.
Those departments/units that had moved easily into outcomes assessment in the first year continued their work and, in many cases, moved directly into a second round of outcomes assessment, while the areas that had shown no interest during the previous two years fell into several categories:

- Departments whose faculty were willing to work on outcomes but were stymied by a resistant department chair
- Departments whose faculty were willing to work on outcomes but were dominated by two or three outspoken opponents
- Departments whose faculty had rarely participated in any projects or committees on campus

Evaluation of Team Activities: In June of 2007, at the end of year three, 80% of the college is working on SLOs/AUOs, 72% on Means of Assessment, and 35% on Summary of Data and Use of Results. While a portion of these significant gains can be attributed to the team's continued attempts to work with resistant departments, the major changes came as a result of two external forces. First, the installation of a number of new department chairs prompted, as in the case of the Nursing department, a decision to begin SLOs work. The second occurred with the distribution of the 2006-2007 PIE form. Deans and Associate Deans actively touted the merits of the process and reasserted the direct connection between evidence presented in PIE and consideration of budget requests, which prompted departments like Physics to begin working on outcomes after a two-year opposition to the process.

Recently, Mt. SAC's definition of campus-wide SLOs participation came into direct conflict with the ACCJC request for an update on the college's SLOs work. This request conveyed the expectation that the college have permanent SLOs for every class it offers and that those SLOs be assessed using predetermined methods semester after semester.
The Coordinator's response to ACCJC outlined the ways in which this expectation contradicted the Nichols model and adjusted the format of the ACCJC's matrix to depict the percentage of disciplines/departments working on SLOs. It remains to be seen how the commission will react to Mt. SAC's approach.

Future Considerations: While the new Coordinator will continue to work toward the goal of 100% participation in outcomes/objectives assessment, this goal is secondary to quality assurance issues that must be addressed (see "Quality of SLOs/AUOs Efforts" following).

Area Two: Effective Communication and Documentation

Summary of Team Activities:

Team Communication: 2006-2007 saw fewer informal SLOs Team meetings because the Coordinator's availability was greatly impacted by the accreditation midterm report, but the team's formal meetings were extended to two hours, and the team met several times over fall and spring to evaluate the 5-column models being generated on campus (see "Quality of SLOs/AUOs Efforts" following).

Campus-Wide Communication: Communication of SLOs/AUOs information continued to occur through the Coordinator's appearances before the Academic Senate, management teams, and instructional divisions. These appearances were once again augmented by beginning- and end-of-semester informational emails and campus update spreadsheets as well as the bi-monthly electronic newsletters. Documentation of all SLOs/AUOs work on campus continued to occur through Quick Place, which will officially close in June but will remain available for the accreditation site visit in 2010. Beginning in fall 2007, Trac Dat will house all SLOs/AUOs documentation and supporting data, and its reports will facilitate communication of the college's progress toward its goals.

Formal and Informal Dialogues with Other Colleges: Conversations with other colleges slowed in year three as the other institutions became immersed in their own processes, but the continual requests for information on how Mt. SAC put together its SLOs web site, how the team documents SLOs work, and how the college reconciled program review processes with outcomes assessment reinforce the fact that Mt. SAC continues to lead outcomes efforts state-wide.

Evaluation of Team Activities: As in the previous two years, the best indicator of the effectiveness of the communication efforts was the increase in campus-wide participation. In year three, Mt. SAC moved from 11% of the campus working on Summary of Data in May 2006 to 35% in May of 2007.

Future Considerations: The new SLOs Coordinator is aware of the need to continue campus-wide communication efforts and will utilize email announcements and public appearances to address this issue.

Area Three: Quality of the SLOs/AUOs Efforts

Summary of Team Activities: In year three, the team participated in a number of activities to ensure quality SLOs/AUOs work on campus. One of the most important of these occurred with the hiring of the Educational Research Assessment Analyst. One of the contributing factors to low levels of SLOs participation in California Community Colleges has been traced to a disconnect between the Research Office and the individuals coordinating the outcomes efforts. With this in mind, the SLOs Coordinator spent the winter 2007 intersession training the new researcher on the Nichols model, accreditation standards, the team facilitation and documentation processes, the team-generated rubric for the assessment of 5-column models, the Planning for Institutional Effectiveness process (PIE), and campus culture and politics. The Researcher then wrote a summary of the role of research in SLOs/AUOs work on campus for the Department of Research and Institutional Effectiveness web site. In spite of the continual notices to the campus via email and Coordinator appearances, the team did not work with as many departments and areas as it would have liked.
In order to assess the quality of the outcomes work occurring without the team's input, the Coordinator requested copies of all PIEs submitted on campus. The Coordinator distributed those PIEs to the appropriate SLOs team members, who then contacted department/unit representatives with suggestions for improvement to the 5-column models and offers of future assistance. The departments' work was then documented in Quick Place. Finally, the team rated a representative sample of all 5-column models that had been created and turned in during the 2006-2007 academic year. Over the course of two semesters, the team continued to update its web site. The most recent addition to the site was a modified rubric available to any departments working on SLOs/AUOs as a means of evaluating the quality of their work. This rubric is also included as an appendix in the Trac Dat User Guide (see "Team Support of PIE and the Conversion to Trac Dat" following).

Evaluation of Team Activities: The team used two assessment methods to evaluate the effective engagement of departments/units in the outcomes process. The first method required a simple tally to determine the degree of increase in SLOs/AUOs participation campus-wide, and the second stipulated that 50% of a representative sample would be rated as acceptable or above on each completed column. To address this statement, the Team rated 107 models (77 SLOs, 30 AUOs) from 66 college units/offices/disciplines, with the greatest representation from the Instructional Services and Administrative Services areas. It is important to note that 28 models, done independently of the Team, were extracted from the PIE documents. Since PIE does not require a specific entry for the mission/goals, Column 1 information for these models was missing. Each model had at least two raters, with 74% of the models having at least four raters.
If there was a tie in the responses of the raters (always between "below" and "meeting" standards), the column rating could not be determined and, for the purpose of comparison, was excluded. The results are summarized in the following table:

Column / Year                            Below       Meets or Exceeds   Unable to Determine   Total        Representation from Models
Column 1: Mission & Goals        2007    42 (53%)    35 (44%)           2 (3%)                79 (100%)    74%
                                 2006    83 (51%)    80 (49%)           --                    163 (100%)   100%
Column 2: Outcomes/Objectives    2007    28 (26%)    63 (59%)           16 (15%)              107 (100%)   100%
                                 2006    47 (29%)    116 (71%)          --                    163 (100%)   100%
Column 3: Means of Assessment    2007    46 (44%)    51 (49%)           7 (7%)                104 (100%)   97%
                                 2006    66 (59%)    46 (41%)           --                    112 (100%)   69%
Column 4: Summary of Data        2007    34 (55%)    21 (34%)           7 (11%)               62 (100%)    58%
                                 2006    16 (67%)    8 (33%)            --                    24 (100%)    15%
Column 5: Use of Results         2007    29 (59%)    17 (35%)           3 (6%)                49 (100%)    46%
                                 2006    13 (76%)    4 (24%)            --                    17 (100%)    10%

A total of 107 models were rated in 2007, as opposed to 163 in 2006. Although there were fewer models to rate overall, participation increased in the later columns of the model (Columns 3, 4, and 5). The "Representation from Models" figures indicate that each of the last three columns saw a large increase in participation. In the 2007 results, over 50% of the models were rated as meeting or exceeding standards in Column 2 only. Column 1 results may have been skewed by the absence of Column 1 on the PIE documents from the aforementioned 28 models. Although Columns 4 and 5 did not meet the designated percentage, there was still an increase in the number and percentage of models that met or exceeded standards, and Columns 3, 4, and 5 experienced an increase in participation from the previous year. This is a very positive sign: a greater portion of the campus is getting further in the assessment process and working to increase the quality of its assessment elements toward meeting or exceeding standards. Based on these results, a few recommendations can be made to further streamline the process.
The rubric, including the embedded checklists, could be reviewed to further standardize the definitions of "below," "meeting," and "exceeding" standards. A few of the ratings could not be categorized in the table above because there was an even split between raters who judged a column below standards and raters who judged it as meeting standards; a revision of the standards and the rating process might allow those ratings to be placed in a category. Thus, although a majority of the model ratings worked well, a redefinition with a pilot run, as well as periodic reviews, would help standardize the rubric and its elements. Furthermore, since many of the models were below standards in Columns 1, 3, 4, and 5, an analysis of these models could be performed to localize persistent challenges. These challenges, depending on their breadth, could be addressed through individual meetings with the departments or through workshops. Overall, with an increase in campus participation in the assessment process as well as more units and offices moving further within the process, there is great opportunity to continue to modify the assessment tools and processes for campus growth and development.

Discipline/Department: SLOs/AUOs Team
Recorded by SLOs/AUOs Team Coordinator: Jemma Blake-Judd
Administrative Unit Objectives Assessment Model: The purpose of this assessment process is to improve service.

Mission & Goals:
College: To provide quality transfer, career and lifelong programs that prepare students with the knowledge and skills needed for success in an interconnected world.
Team: The SLOs/AUOs Implementation Team provides training and resources for faculty, staff, and managers while documenting and ensuring the quality of the SLOs/AUOs process.

Intended Objectives (2-10-07):
1. The SLOs/AUOs Team will increase the number of dept/unit participants who effectively engage in the SLOs/AUOs process.

Means of Assessment (2-10-07):
1a. Using the SLOs/AUOs Team May 2007 Campus Update as compared to the May 2006 update, SLOs/AUOs participation in all 158 depts./units will increase to 95%, and Means of Assessment participation will increase to 75%, as calculated by the ERRA by June 1, 2007.
1b. Using the SLOs/AUOs Team-generated rubric, 50% of a representative sample of 5-column models submitted will be rated acceptable or above on each completed column, as evaluated by SLOs/AUOs Team members and calculated by the ERRA by June 1, 2007.

Summary of Data Collected (6-10-07):
1a. May 2006: SLOs/AUOs: 64%; Means of Assessment: 45%. May 2007: SLOs/AUOs: 80%; Means of Assessment: 72%.
1b. Over 50% of the models were rated as meeting or exceeding standards in Column 2 only. Column 1 results may have been skewed because of the absence of Column 1 on PIE documents. Although Columns 4 and 5 did not meet the designated percentage, there was still an increase in the percentage of models that met or exceeded standards. Furthermore, Columns 3, 4, and 5 experienced an increase in participation from the previous year.

Use of Results (6-10-07):
1a. While there has been a significant increase in SLOs/AUOs activity, at this point a comparison of participation in 2006 to that in 2007 is not possible, as no current data are available on Administrative Services (21 units/depts.) or Student Services (15 units/depts.). The SLOs Coordinator will update the Summary of Data and Use of Results columns as soon as the data are available.
1b. A greater portion of the campus is getting further in the assessment process and working to increase the quality of its assessment elements toward meeting or exceeding standards. Although a majority of the model ratings worked well, periodic reviews would help standardize the rubric and its elements. Furthermore, since many of the models were below standards in Columns 1, 3, 4, and 5, an analysis of these models should be performed to localize persistent challenges.
These challenges, depending on their breadth, could be addressed through individual meetings with the departments or through workshops.

Future Considerations: The quality assurance issue is not something that will disappear when the implementation phase is over. While Trac Dat does promote better quality in the SLOs/AUOs process through the use of fields that prompt the inclusion of requisite details, the user may choose to ignore those fields; therefore, it is necessary to establish a quality assurance process in which the SLOs and AUOs being created across campus are monitored through a periodic review of the 5-column models housed in Trac Dat, with department representatives contacted if there are problems. The quality assurance process should also provide workshops, offered for crossover credit through the Office of Professional and Organizational Development, on the most difficult aspect of the outcomes/objectives process: determining a means of assessment.

Research across the country over the past twenty years has shown that the quality of outcomes assessment will be compromised if there is a punitive connection between assessment results and faculty evaluations. Mt. SAC's refusal to comply with the accreditation standard requiring a connection between SLOs work and faculty evaluations, and the resulting resolution written on that topic, will go a long way toward ensuring that its faculty feel free to engage in meaningful outcomes assessment.

Finally, the SLOs web site will provide support for the outcomes/objectives efforts on campus. Campus constituents should be able to find current information about Mt. SAC's activities along with recent articles from experts in the field and resources such as the SLOs Team rubric for the assessment of 5-column models. The SLOs Coordinator will continue to monitor and update the site as needed.
Area Four: Team Support of the Planning for Institutional Effectiveness Process (PIE) and the Conversion to Trac Dat

Summary of Team Activities: The team received essentially the same number of requests for help with PIE in 2006-2007 as it did in 2005-2006, but the assistance the faculty and managers required was, more often than not, reassurance that the work was correct. The 2006-2007 PIE forms and PIE summaries copied to the SLOs Coordinator showed a significantly better understanding of the process. The only problems that surfaced were in the outcomes included in the PIE forms submitted by departments that had engaged in the process without SLOs team assistance.

Trac Dat and the new Accreditation ifolio will provide Mt. SAC with an efficient means of documenting Planning for Institutional Effectiveness efforts and reporting the college's progress toward its goals. The SLOs Coordinator and team spent dozens of hours working with the Information Technology Specialist on the conversion of the existing PIE process to Trac Dat. This conversion included both a modification of the appearance of Trac Dat's screens and the creation of a Trac Dat user guide. The guide was set up to present corresponding elements of the PIE paper form on the page opposite the Trac Dat screen image, to help trainees understand the similarity between the paper and electronic processes. When it came time to set up the format for the training process, the team suggested that the training would be greatly improved if the departments could see their previous year's PIE on the Trac Dat screen during the training sessions. As a result, in spring 2007, all departments/units were asked to submit electronic copies of their PIEs to the IT Specialist to be uploaded to Trac Dat before the training sessions begin in the summer of 2007. It was also determined that the SLOs Coordinator should be available at all Trac Dat training sessions to address any SLOs/AUOs questions.
Evaluation of Team Activities: The SLOs team members successfully modified the Trac Dat software to match the current PIE process, and at the same time they created an effective training format and user-friendly guide. Further modifications will take place after the first series of evaluations is conducted during the summer training sessions.

Future Considerations: Trac Dat will be evaluated in a number of ways: through evaluations completed at training sessions, through questions in the next employee survey, and through anecdotal evidence from users. The Trac Dat reporting mechanism will also be evaluated by the Institutional Effectiveness Committee in order to meet accreditation standards I.B.6 and I.B.7, which call for scrutiny of the college's evaluative mechanisms.

Conclusion

Research around the state of California has revealed that while many colleges are actively engaged in outcomes assessment, they are facing serious difficulties sustaining and documenting those efforts and connecting outcomes assessment to institutional effectiveness processes. Although Mt. SAC has not achieved 100% participation in its SLOs/AUOs implementation, it has created a viable connection between outcomes/objectives and planning for institutional effectiveness. It also has a sophisticated documentation and reporting system in place, along with a dedicated researcher, permanent SLOs and Gen Ed Coordinator positions, an active Gen Ed committee, and a well-established web site, all of which should ensure the college will continue to serve its students and the community well.