Integrating Student Learning into Program Review

Barbara Wright
Associate Director, WASC
bwright@wascsenior.org

February 1, 2008
Retreat on Student Learning and Assessment, Irvine


Assessment & Program Review: related but different
Program review typically emphasizes inputs, e.g.
- Mission statement, program goals
- Faculty, their qualifications
- Students, enrollment levels, qualifications
- Library, labs, technology, other resources
- Financial support

Assessment & Program Review: related but different, cont.
Program review typically emphasizes processes, e.g.
- Faculty governance
- Curriculum review
- Planning
- Follow-up on graduates
- Budgeting
- And yes, assessment may be one of these

Assessment & Program Review: related but different, cont.
Program review typically emphasizes indirect indicators of student learning and academic quality, e.g.
- Descriptive data
- Surveys of various constituencies
- Existence of relationships, e.g. with area businesses, the professional community
Program review has traditionally neglected actual student learning outcomes.

Assessment & Program Review: related but different, cont.
PR is typically conceived as
- Data-gathering
- Looking at the past 5-8 years
- Reporting after the fact where the program has been
- Using PR to garner resources, or at least protect what the program has
- Projecting needs into the future
- Expressing "quality" and "improvement" in terms of a case for additional inputs

Capacity vs. Educational Effectiveness for Programs
- Capacity questions: What does the program have in the way of inputs, processes, and evidence of outputs or outcomes? What does it need, and how will it get what it needs?
- Educational Effectiveness questions: How effectively do the inputs and processes contribute to desired outcomes? How good are the outputs? The student learning?

Assessment & Program Review: related but different
Assessment is all about
- Student learning and improvement at the individual, program, and institutional levels
- Articulation of specific learning goals (as opposed to program goals, e.g. "We will place 90% of graduates in their field.")
- Gathering of direct, authentic evidence of learning (as opposed to indirect evidence, descriptive data)

Assessment & Program Review: related but different, cont.
Assessment is all about
- Interpretation and use of findings to improve learning and thus strengthen programs (as opposed to reporting of data to improve inputs)
- A future orientation: "Here's where we are, and here's where we want to go in student learning over the next 3-5 years"
- Understanding the learning "problem" before reaching for a "solution"

Assessment & Program Review: related but different, cont.
Assessment of student learning and program review are not the same thing. However, there is a place for assessment as a necessary and significant input in program review. We should look for
- A well-functioning process
- Key learning goals
- Standards for student performance
- A critical mass of faculty (and students) involved
- Verifiable results, and
- Institutional support

The Assessment Loop
1. Goals, questions
2. Gathering evidence
3. Interpretation
4. Use

The Assessment Loop: Capacity Questions
1. Does the program have student learning goals, questions?
2. Do they have methods and processes for gathering evidence? Do they have evidence?
3. Do they have a process for systematic, collective analysis and interpretation of evidence?
4. Is there a process for use of findings for improvement? Is there admin. support, planning, budgeting? Rewards for faculty?

The Assessment Loop: Effectiveness Questions
1. How well do they achieve their student learning goals, answer their questions?
2. How aligned are the methods? How effective are the processes? How complete is the evidence?
3. How well do the processes for systematic, collective analysis and interpretation of evidence work? What have they found?
4. What is the quality of follow-through on findings for improvement? Is there improvement? How adequate and effective are admin. support, planning, budgeting? Rewards for faculty?

Don't confuse program-level assessment and program review
- Program-level assessment means we look at learning at the program level (not just the individual student or course level) and ask what all the learning experiences of a program add up to, at what standard of performance (results).
- Program review looks for program-level assessment of student learning but goes beyond it, examining other components of the program (mission, faculty, facilities, demand, etc.).

What does WASC want? Both!
- Systematic, periodic program review, including a focus on student learning results as well as other areas (inputs, processes, products, relationships)
- An improvement-oriented student learning assessment process as a routine part of the program's functioning

Institutionalizing Assessment: 2 aspects
- The PLAN for assessment (i.e. a shared definition of the process, purpose, values, vocabulary, communication, use of findings)
- The STRUCTURES and RESOURCES that make the plan doable

How to institutionalize
- Make assessment a freestanding function, or
- Attach it to an existing function, e.g.
  - Accreditation
  - Academic program review
  - Annual reporting process
  - Center for Teaching Excellence
  - Institutional Research

Make assessment freestanding: positives and negatives
Positives:
- Maximum flexibility
- Minimum threat, upset
- A way to start
Negatives:
- Little impact
- Little sustainability
- Requires formalization eventually, e.g. an Office of Assessment

Attach to Office of Institutional Research: positives and negatives
Positives:
- Strong data gathering and analysis capabilities
- Responds to external expectations
- Clear responsibility
- IR has resources
- Faculty not "burdened"
Negatives:
- Perception: assessment = data gathering
- Faculty see little or no responsibility
- Faculty uninterested in reports
- Little or no use of findings

Attach to Center for Teaching Excellence: positives and negatives
Positives:
- Strong impact possible
- Ongoing, supported
- Direct connection to faculty, classroom, learning
- Chance for maximum responsiveness to the "use" phase
Negatives:
- Impact depends on how broadly assessment is done
- No enforcement
- Little/no reporting, communicating
- Rewards, recognition vary, may be lip service

Attach to annual report: positives and negatives
Positives:
- Some impact (depending on stakes)
- Ongoing
- Some compliance; habit, expectation
- Closer connection to classroom, learning
- Cause/effect possible
- Allows flexibility
Negatives:
- Impact depends on how seriously, how well the annual report is done
- No resources
- Reporting, not improving, unless specified
- Chair writes; faculty involvement varies

Attach to accreditation: positives and negatives
Positives:
- Maximum motivation
- Likely compliance
- Resources available; staff, faculty assigned
- Clear cause/effect
Negatives:
- Resentment of external pressure; us/them dynamic
- Episodic, not ongoing
- Reporting, gaming, not improving
- Little faculty involvement
- Little connection to the classroom, learning
- Main focus: inputs, process

Attach to program review: positives and negatives
Positives:
- Some impact (depending on stakes)
- Some compliance
- Some resources available; staff, faculty assigned
- Cause/effect varies
Negatives:
- Impact depends on how seriously, how well PR is done
- Episodic, not ongoing
- Inputs, not outcomes
- Reporting, not improving
- Generally low faculty involvement
- Anxiety, risk-aversion
- Weak connection to the classroom, learning

How can we deal with the disadvantages?
- Strong message from administration: PR is serious, has consequences (bad and good)
- Provide attentive, supportive oversight
- Redesign PR to be continuous
- Increase the weighting of assessment in the overall PR process
- Involve more faculty; stay close to the classroom, the program
- Focus on outcomes, reflection, USE
- Focus on improvement (not just "good news") and REWARD IT

How can we increase the weighting of learning & assessment in PR? E.g.,
From:
- Optional part
- One small part of the total PR process
- Assessment vague, left to the program
- Various PR elements of equal value (or no value indicated)
- Little faculty involvement
To:
- Required
- Core of the process (so defined in instructions)
- Assessment expectations defined
- Points assigned to PR elements; student learning gets 50% or more
- Broad involvement

Assessment serves improvement and accountability
A well-functioning assessment effort systematically improves curriculum, pedagogy, and student learning; this effect is documented. At the same time,
- The presence of an assessment effort is an important input and indicator of quality;
- The report on the beneficial effects of assessment serves accountability; and
- Assessment findings support $ requests.

New approaches to PR/assessment
- Create a program portfolio
- Keep program data continuously updated
- Do assessment on an annual cycle
- Enter assessment findings and uses by semester or annually
- For periodic PR, review the portfolio and write a reflective essay on student AND faculty learning