Technology Usage Plan – EdTech 501

The Technology Usage Plan for EdTech 501 is the artifact I’ve chosen for this standard. I created it with Glynda Pflieger and Nancy O’Sullivan. In it we analyzed why we felt a need for a technology plan, the vision for our students and institution, and the specific process we’d use to develop the plan. Building a team to help formulate the plan meant involving people from various sections of the staff, along with some strategic members of the community who could help us meet our goals. That team would then assess the equipment and training needs of our staff, building, and students. Once the needs were clearer, we would set objectives and a timeline to guide their completion. The plan also calls for the team to develop benchmarks to gauge the success of each part of the plan. I think it is a very comprehensive plan. Through representation from staff at various levels, we wanted to achieve a consensus that everyone in the building could support. The plan would surely be considered long range; knowing the staff in my building, just forming the committee could take weeks. One of the major drawbacks I see with creating a technology plan is the speed of technology evolution. If we had started to implement this plan at my school four years ago, many of the hardware and connectivity requirements could have been outdated by now. However, almost anything would have been an improvement. Despite the constant changes, I do feel this plan solidly meets the standard for strategic long-range planning.

Icebreaker Activity Synchronous Evaluation Tool – EdTech 523

My Synchronous Evaluation Checklist of an online lesson for EdTech 523 is the artifact I’ve chosen for this standard.  We all submitted icebreakers that we could use at the beginning of an online course to help develop a sense of community.  We were given a rubric that evaluated a classmate’s activity on eight criteria.  The formative evaluation was based on rating each criterion as Not Observed, Basic, or Proficient.  If too many of the categories were marked Not Observed, the project would receive a failing grade.  Scores of Basic showed that some of the elements were in place, but not developed enough to be considered challenging or beyond rudimentary.  Only activities that received consistent marks of Proficient should be seen as highly developed and not in need of further adjustment.  From the ratings they were given, authors would know which areas of the assignment needed to be addressed.

I feel that summative evaluation was completed at the same time through the comments provided in each appropriate box (those comments could be seen as formative as well).  By reading the comments, a reviewer could judge whether the icebreaker activity was developed enough, or suitable, for a specific online course.  The criticisms could give an instructor the information needed to decide whether it was something worth using.

Livability WebQuest Rubric – EdTech 502

I created two rubrics for the Livability WebQuest assigned in EdTech 502. Students could complete the assignment with either a pictorama or a synthesis paper. In either case they were provided with a rubric to self-evaluate while creating their product. The same rubric was also used for grading by a peer group and by the teacher. I worried that the students who peer reviewed the products would not follow the scoring guidelines because they didn’t want to be hard on classmates, and in some cases that was true. But as some research has found, in most cases peers do a reliable job of rating fellow students with the use of a rubric (Hafner & Hafner, 2003). I wasn’t really relying on the students to do the grading for me, though. The main purpose was for them to see what standard they should aim for, depending on the grade they wanted. I also felt the peer review process gave students a chance to critically evaluate another’s work, in the hope of improving their ability to assess their own. By providing them with a rubric, I met the standard for providing criterion-based measurement of student mastery of content.

Peer Review Screencast – EdTech 543

The Peer Review Screencast for EdTech 543 gave me the chance to go through what my students experience. We had created a MOOC for the main course project and were then to share it with a classmate for evaluation against a rubric provided before we began the project. Many times I’ve had students in my classes evaluate each other with a rubric and then wondered how they could have given such a high score. They always tell me they feel bad giving a low score, or are afraid others will give them a bad grade in return. When I did the peer review of the MOOC, I have to admit I felt a little apprehensive about being totally honest in the video. But I got over it, knowing the whole exercise is only valid if we can be truthful in a constructive way. The unit I reviewed had some trouble with the use of folders in Edmodo. My classmate had trouble making the folders for her three units, and when I went to the site I could not access her documents. I told her about it, and she still wasn’t able to fix it. I understand that sometimes we just can’t figure out how to do something, but I had to mark her down on that section of the rubric because students trying to access those folders would have run into a problem. I thought the activity worked great as a screencast, and when that technology becomes available to students at my school, I will definitely use it. Having provided two examples of using rubrics to gauge content mastery, from both sides of the equation, shows my completion of the standard.

Technology Maturity Benchmarks Evaluation – EdTech 501

I remember feeling like a true intellectual when I was doing the research for the Technology Maturity Benchmarks project in EdTech 501.  I had never really thought much about most of the data I collected for the assignment.  The facts relate to my middle school, but I changed its name for the project.  As the title states, we gathered information to evaluate the technology maturity of an institution: to what level has ‘Northside Middle School’ involved itself in the use of technology across various areas (administration, curriculum, support, connectivity, and innovation)?  First I described our clientele in terms of racial and financial backgrounds; understanding our students’ backgrounds can help us understand what problems we may face in meeting their technology needs.  Each of the five categories mentioned above was then broken down into subcategories.  Each received a rating of island (almost no evidence of technology use), emergent (some signs of use, but underdeveloped), integrated (technology is used, but not at optimal levels), or intellectual (technology is a fully functioning part of what the school does).  The ratings addressed both the infrastructure in place and the behavior and attitudes of employees and students.  Northside didn’t score intellectual on any part of the assessment.  In many areas there was no evidence of a plan to integrate technology into our organization or teaching; it wasn’t even welcomed by many staff.  I found that our curriculum used little to no modern technology at all.  Some teachers were still using slide projectors.  LCD projectors were not yet in every room.  We didn’t have WiFi.  Our computers were well over five years old, and our attitudes were even older.  By completing this evaluation I was able to recognize the deficiencies of my organization.
Today, the administration is willing to let me encourage staff-wide use of Google Docs to save money on copies for staff paperwork and to help us connect to our clients outside the school day.  I’ve assisted many staff in creating class websites, and all the social studies teachers in the district share lessons and materials via Google Drive.  All of this started when I realized how poorly we scored on the technology maturity benchmarks assignment.  I know that these types of evaluations are something I am capable of conducting and can use to gather data for future problem solving.


Northside Middle School Technology Benchmarks Evaluation

This technology evaluation is for Northside Middle School.  It is in a district that is 67% Caucasian, 29% Hispanic, 2% Native American, and 1% each African American and Asian.  In our student body, 12% of students have limited English skills, and 58% are eligible for free or reduced lunch.  The following is my evaluation of how Northside ranks on the technology maturity benchmarks.

ADMINISTRATIVE

Policy

behavioral – Emergent or Island (I see no hard evidence of a plan; I hadn’t cared until now)

resource/infrastructure – Island (Have seen no policy or TUP, could not locate one)

Planning

behavioral – Island (I haven’t seen a plan, but some of the things done would have required one)

resource/infrastructure – Island (Again, not much evidence of planning; I have never heard of teacher involvement on a committee about technology)

Budget

behavioral – Island (teachers usually fundraise to get equipment, or buy it with district monies)

resource/infrastructure – Island (I have not seen a copy of our tech budget.  There have been some moderate improvements made, but nothing extensive that would allow for total integration of tech into the curriculum)

Administrative Information

behavioral -Integrated (We all have access to some level of technology, but still rely mostly on methods available 15 years ago)

resource/infrastructure – Integrated (We all have access to some level of technology, but still rely mostly on methods available 15 years ago)

CURRICULAR

Electronic Information

behavioral – Island (Most instruction is done with traditional methods like handouts and texts, but websites for research are sometimes used.  It depends on when you can get a lab date)

resource/infrastructure – Island (One of the biggest barriers to student use of technology is that computers are not available except in the lab.  Some, but not most, teachers may have a bank of 1-4 computers in their room)

Assessment

behavioral – Island (Most assessments are done on paper.  PowerPoints are occasionally used.  I used to feel I was doing a good job of putting the test questions on the screen via projector)

resource/infrastructure – Island (We just this year got a set of test clickers)

Curricular Integration

behavioral – Emergent (Almost none of our curriculum depends on technology to make it work.  A teacher could go all year in our curriculum without using technology)

resource/infrastructure – Island (We have some resources available, especially math and science, but they are limited)

Teacher Use

behavioral – Emergent (The computer lab reservation sheet is dominated by 2-5 teachers throughout the year.  Science, math, and Language Arts use it the most, but it totally depends on the style of the teacher)

resource/infrastructure – Integrated (The lack of use isn’t because we have no access.  Some of the problems lie with teachers’ fear of blocked sites stalling a unit or lesson)

Student Use

behavioral – Emergent (Students are increasingly using technology to enhance their education.  However, students with no access at home are probably behind in their ability to use technology while at school)

resource/infrastructure – Island

SUPPORT

Stakeholder Involvement

behavioral – Emergent (I classified this as emergent because, as a stakeholder, I have never heard of meetings on technology planning, though I have been in many planning meetings regarding other areas of school improvement)

resource/infrastructure – Emergent (From what I could gather, one or two administrators and the district IT leader make most decisions)

Administrative Support

behavioral – Island (With standardized testing, budget concerns, and parent interaction, I feel they have little time for this area)

resource/infrastructure – Island (Little measurable action in this area)

Training

behavioral – Emergent (Most professional development is focused on teaching methods not involving technology)

resource/infrastructure – Emergent (Any training we get that involves technology is ancillary, and the ideas being taught are more important than the technology used to deliver it)

Technical/Infrastructure Support

behavioral – Emergent (The small number of staff who want to use technology in their classes seek out information to help them achieve their goals)

resource/infrastructure – Island (We have good IT support.  The main problem I’ve faced is that getting sites unblocked can be annoying; it feels like you are making your case to a jury)

CONNECTIVITY

Local Area Networking (LAN)

behavioral – Island (Each student has their own spot on the network that they can save work to, but I feel they rarely get a chance to use computers for assignments.  For most students, there is no way for them to connect through the network)

resource/infrastructure – Island (The only places the network can be accessed is the three computer labs, four machines in the library, or any units in classrooms.  Most classes have no student-accessible computers)

District Area Networking (WAN)

behavioral – Emergent (Each teacher has an in/outbox on the network, but I use mine once or twice a year.  If students are rarely able to access the school’s network while at school, why should I bother to put work there or force them to turn in work to my inbox?  Recently, when I tried to have them turn in work to my inbox, they were blocked, so I instead had them upload work to edmodo.com)

resource/infrastructure – Island (Again, with little access to lab time or in room computer banks, the small network resources we have don’t mean much practically)

Internet Access

behavioral – Emergent (All teachers have access; students have limited access.  Teachers mostly use computers to record attendance, keep grades, and do research.  I feel the internet is rarely used as a classroom tool the way a textbook is)

resource/infrastructure – Integrated (We have good access for teachers, but could have more for students)

Communication Systems

behavioral – Island (Email is used for communication; unfortunately, the same items are also sent out on paper because no one trusts that emails get read)

resource/infrastructure – Island (Students have almost no access because the service is blocked.  I recently got access for my classes through gaggle.net, though I had to get special permission from our IT leader at the district level.  I found that only two teachers in our district use this service)

INNOVATION

New Technologies

behavioral – Island (I would give us island status because we don’t reject new technologies; we just decide they aren’t worth the time and effort to adapt and integrate into the curriculum)

resource/infrastructure – Island (One reason I believe new ideas aren’t implemented is that we get little to no professional development in technology.  If it isn’t important enough for the district to show us how to use something, then it must not be good enough for me to learn on my own)

Comprehensive Technologies

behavioral – Island (All rooms now have projectors connected to our computers.  However, I know some teachers who use them rarely because they know replacement bulbs are expensive)

resource/infrastructure – Island (The school has two digital cameras that can be checked out by teachers.  Teachers may use state-allotted money to buy a scanner, but none are provided by the school.  Next year, due to budget cuts, teachers will receive no state money)

Overall, I would rank our school in the island range.  That is where we stood in the majority of categories, and it sums up our overall attitude toward technology integration.  We have the tools, but they don’t work well with each other or with our personnel and students.  Implementation could be much more extensive and better functioning.  Not much of what we do could be considered integrated, and almost none of it intellectual in design.