
Friday, December 14, 2018

Project Management and Monitoring

Monitoring is the regular observation and recording of activities taking place in a project or programme. It is a process of routinely gathering information on all aspects of the project. To monitor is to check on how project activities are progressing. It is observation; systematic and purposeful observation. Monitoring also involves giving feedback about the progress of the project to the donors, implementers and beneficiaries of the project. Reporting enables the gathered information to be used in making decisions for improving project performance. Monitoring is the systematic collection and analysis of information as a project progresses. It is aimed at improving the efficiency and effectiveness of a project or organisation. It is based on targets set and activities planned during the planning phases of work. It helps to keep the work on track, and can let management know when things are going wrong. If done properly, it is an invaluable tool for good management, and it provides a useful base for evaluation. It enables you to determine whether the resources you have available are sufficient and are being well used, whether the capacity you have is sufficient and appropriate, and whether you are doing what you planned to do.

Purpose of Monitoring: Monitoring is very important in project planning and implementation. It is like watching where you are going while riding a bicycle; you can adjust as you go along and ensure that you are on the right track. Monitoring provides information that will be useful in:
• Analysing the situation in the community and its project;
• Determining whether the inputs in the project are well utilized;
• Identifying problems facing the community or project and finding solutions;
• Ensuring all activities are carried out properly, by the right people and in time;
• Using lessons from one project experience in another; and
• Determining whether the way the project was planned is the most appropriate way of solving the problem at hand.

Planning, Monitoring and Controlling Cycle: [diagram]

Importance of Monitoring: Monitoring is important because:
• it provides the only consolidated source of information showcasing project progress;
• it allows actors to learn from each other's experiences, building on expertise and knowledge;
• it often generates (written) reports that contribute to transparency and accountability, and allows for lessons to be shared more easily;
• it reveals mistakes and offers paths for learning and improvement;
• it provides a basis for questioning and testing assumptions;
• it provides a means for agencies seeking to learn from their experiences and to incorporate them into policy and practice;
• it provides a way to assess the crucial link between implementers and beneficiaries on the ground and decision-makers;
• it adds to the retention and development of institutional memory;
• it provides a more robust basis for raising funds and influencing policy.

WHY DO MONITORING AND EVALUATION? Monitoring and evaluation enable you to check the "bottom line" (see Glossary of Terms) of development work: not "are we making a profit?" but "are we making a difference?"
Through monitoring and evaluation, you can:
_ Review progress;
_ Identify problems in planning and/or implementation;
_ Make adjustments so that you are more likely to "make a difference".

In many cases, "monitoring and evaluation" is something that is seen as a donor requirement rather than a management tool. Donors are certainly entitled to know whether their money is being properly spent, and whether it is being well spent. But the primary (most important) use of monitoring and evaluation should be for the organisation or project itself to see how it is doing against objectives, whether it is having an impact, whether it is working efficiently, and to learn how to do it better. Plans are essential, but they are not set in concrete (totally fixed). If they are not working, or if the circumstances change, then plans need to change too. Monitoring and evaluation are both tools which help a project or organisation know when plans are not working, and when circumstances have changed. They give management the information it needs to make decisions about the project or organisation, and about changes that are necessary in strategy or plans. Through this, the constants remain the pillars of the strategic framework: the problem analysis, the vision, and the values of the project or organisation. Everything else is negotiable. (See also the toolkit on strategic planning.) Getting something wrong is not a crime. Failing to learn from past mistakes because you are not monitoring and evaluating, is.

The effect of monitoring and evaluation can be seen in the following cycle. Note that you will monitor and adjust several times before you are ready to evaluate and replan.

Monitoring involves:
_ Establishing indicators (see Glossary of Terms) of efficiency, effectiveness and impact;
_ Setting up systems to collect information relating to these indicators;
_ Collecting and recording the information;
_ Analysing the information;
_ Using the information to inform day-to-day management.
Monitoring is an internal function in any project or organisation.

WHAT DO WE WANT TO KNOW? What we want to know is linked to what we think is important. In development work, what we think is important is linked to our values. Most work in civil society organisations is underpinned by a value framework. It is this framework that determines the standards of acceptability in the work we do. The central values on which most development work is built are:
_ Serving the disadvantaged;
_ Empowering the disadvantaged;
_ Changing society, not just helping individuals;
_ Sustainability;
_ Efficient use of resources.
So, the first thing we need to know is: is what we are doing, and how we are doing it, meeting the requirements of these values? In order to answer this question, our monitoring and evaluation system must give us information about:
_ Who is benefiting from what we do? How much are they benefiting?
_ Are beneficiaries passive recipients, or does the process enable them to have some control over their lives?
_ Are there lessons in what we are doing that have a broader impact than just what is happening on our project?
_ Can what we are doing be sustained in some way for the long term, or will the impact of our work cease when we leave?
_ Are we getting optimum outputs for the least possible amount of inputs?
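As a tiny illustration of that last question (are we getting optimum outputs for the least possible amount of inputs?), here is a minimal sketch that turns it into a number that can be tracked over time. The activities, costs and output figures are invented for illustration; they are not drawn from the toolkit.

```python
# Minimal, illustrative sketch: comparing outputs against inputs so the
# question "are we getting optimum outputs for the least possible amount
# of inputs?" becomes a number that can be tracked over time.
# The activities, costs and output figures below are invented.

activities = [
    {"activity": "community workshop", "cost": 12000, "people_reached": 150},
    {"activity": "training-centre workshop", "cost": 18000, "people_reached": 160},
]

for a in activities:
    cost_per_person = a["cost"] / a["people_reached"]
    print(f"{a['activity']}: {cost_per_person:.2f} spent per person reached")
```

On its own, a ratio like this proves nothing about quality, but monitored over time, and read alongside effectiveness and impact indicators, it shows whether resources are being well used.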
Monitoring: When you design a monitoring system, you are taking a constructive viewpoint and establishing a system that will provide useful information on an ongoing basis so that you can improve what you do and how you do it. A suggested process for designing a monitoring system follows. For a case study of how an organisation went about designing one, see the case study later in this post.

DESIGNING A MONITORING SYSTEM: Below is a step-by-step process you could use in order to design a monitoring system for your organisation or project.

Step 1: At a workshop with appropriate staff and/or volunteers, run by you or a consultant:
_ Introduce the concepts of efficiency, effectiveness and impact (see Glossary of Terms).
_ Explain that a monitoring system needs to cover all three.
_ Generate a list of indicators for each of the three aspects.
_ Clarify what variables (see Glossary of Terms) need to be linked. So, for example, do you want to be able to link the age of a teacher with his/her qualifications in order to answer the question: are older teachers more or less likely to have higher qualifications?
_ Clarify what information the project or organisation is already collecting.

Step 2: Turn the input from the workshop into a brief for the questions your monitoring system must be able to answer. Depending on how complex your requirements are, and what your capacity is, you may choose to go for a computerised database or a manual one. If you want to be able to link many variables across many cases (e.g. participants, schools, parent involvement, resources, urban/rural, etc.), you may need to go the computer route. If you have only a few variables, you can probably do it manually. The important thing is to begin by knowing what variables you are interested in and to keep data on these variables. Linking and analysis can take place later. (These concepts are complicated; it will help you to read the case study later in this post.) From the workshop you will know what you want to monitor. You will have the indicators of efficiency, effectiveness and impact that have been prioritised. You will then select the variables that will help you answer the questions you think are important. So, for example, you might have an indicator of impact which is that "safer sex options are chosen" as an indicator that "young people are now making sensible and mature lifestyle choices". The variables that might affect the indicator include:
_ Age
_ Gender
_ Religion
_ Urban/rural
_ Economic category
_ Family environment
_ Length of exposure to your project's initiative
_ Number of workshops attended.
By keeping the right information you will be able to answer questions such as:
_ Does age make a difference to the way our message is received?
_ Does economic category make a difference, i.e. do young people in richer areas respond better or worse to the message, or does it make no difference?
_ Does the number of workshops attended make a difference to the impact?
Answers to these kinds of questions enable a project or organisation to make decisions about what they do and how they do it, to make informed changes to programmes, and to measure their impact and effectiveness.
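As a rough illustration of Step 2 (keeping data on the variables you care about so that they can later be linked to an impact indicator), here is a minimal sketch. The records, field names and grouping threshold are invented for illustration; a real system would draw its records from its own database.

```python
# Illustrative sketch only: each record keeps the variables of interest
# alongside the impact indicator ("safer choice made"), so the two can be
# linked during analysis. The data and field names are invented.
from collections import defaultdict

records = [
    {"age": 15, "gender": "F", "setting": "urban", "workshops_attended": 1, "safer_choice": False},
    {"age": 16, "gender": "M", "setting": "rural", "workshops_attended": 3, "safer_choice": True},
    {"age": 17, "gender": "F", "setting": "rural", "workshops_attended": 4, "safer_choice": True},
    {"age": 15, "gender": "M", "setting": "urban", "workshops_attended": 2, "safer_choice": False},
]

# Does the number of workshops attended make a difference to the impact?
groups = defaultdict(list)
for r in records:
    band = "3+ workshops" if r["workshops_attended"] >= 3 else "fewer than 3 workshops"
    groups[band].append(r["safer_choice"])

for band, outcomes in groups.items():
    share = sum(outcomes) / len(outcomes)
    print(f"{band}: {share:.0%} chose the safer option ({len(outcomes)} respondents)")
```

The same grouping could be repeated for age, gender or any other variable that was kept, which is why Step 2 stresses deciding on the variables up front.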
Answers to questions such as the following enable the project or organisation to measure and improve their efficiency:
_ Do more people attend sessions that are organised well in advance?
_ Do more schools participate when there is no charge?
_ Do more young people attend when sessions are over weekends or in the evenings?
_ Does it cost less to run a workshop in the community, or to bring people to our training centre to run the workshop?

Step 3: Decide how you will collect the information you need (see the section on collecting information) and where it will be kept (on computer, in manual files).
Step 4: Decide how often you will analyse the information; this means putting it together and trying to answer the questions you think are important.
Step 5: Collect, analyse, report.

PURPOSE OF MONITORING AND EVALUATION: What development interventions make a difference? Is the project having the intended results? What can be done differently to better meet goals and objectives? These are the questions that monitoring and evaluation allow organizations to answer. Monitoring and evaluation are important management tools to track your progress and facilitate decision making. While some funders require some type of evaluative process, the greatest beneficiaries of an evaluation can be the community of people with whom your organization works. By closely examining your work, your organization can design programs and activities that are effective, efficient, and produce powerful results for the community.

Definitions are as follows: Monitoring can be defined as a continuing function that aims primarily to provide the management and main stakeholders of an ongoing intervention with early indications of progress, or lack thereof, in the achievement of results. An ongoing intervention might be a project, program or other kind of support to an outcome. Monitoring helps organizations track achievements by a regular collection of information to assist timely decision making, ensure accountability, and provide the basis for evaluation and learning.

STRATEGIC QUESTIONS: In conducting monitoring and evaluation efforts, the specific areas to consider will depend on the actual intervention and its stated outcomes. Areas and examples of questions include:
• Relevance: Do the objectives and goals match the problems or needs that are being addressed?
• Efficiency: Is the project delivered in a timely and cost-effective manner?
• Effectiveness: To what extent does the intervention achieve its objectives? What are the supportive factors and obstacles encountered during the implementation?
• Impact: What happened as a result of the project? This may include intended and unintended positive and negative effects.
• Sustainability: Are there lasting benefits after the intervention is completed?

COMMON TERMS: Monitoring and evaluation take place at different levels. The following box defines the common terms, with examples:
• INPUTS: The financial, human, and material resources used for the development intervention. Examples: technical expertise, equipment, funds.
• ACTIVITIES: Actions taken or work performed. Example: training workshops conducted.
• OUTPUTS: The products, capital goods, and services that result from a development intervention. Examples: number of people trained, number of workshops conducted.
• OUTCOMES: The likely or achieved short- and medium-term effects or changes of an intervention's outputs. Examples: increased skills, new employment opportunities.
• IMPACTS: The long-term consequences of the program, which may be positive and negative. Example: improved standard of living.
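As a small, hypothetical illustration (not code from any particular M&E tool), the results chain defined in the box above can be held in a simple structure so that every indicator can be tagged with the level it measures:

```python
# Hypothetical sketch of the results chain from the box above, using its
# own examples. A structure like this lets each indicator be tagged with
# the level of the chain it measures; the format is illustrative only.
results_chain = {
    "inputs": ["technical expertise", "equipment", "funds"],
    "activities": ["training workshops conducted"],
    "outputs": ["number of people trained", "number of workshops conducted"],
    "outcomes": ["increased skills", "new employment opportunities"],
    "impacts": ["improved standard of living"],
}

for level, examples in results_chain.items():
    print(f"{level.upper()}: {', '.join(examples)}")
```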
STEP-BY-STEP: Planning for Monitoring and Evaluation: The steps for designing a monitoring and evaluation system depend on what you are trying to monitor and evaluate. The following is an outline of some general steps you may take in thinking this through at the time of planning your activities:
1. Identify who will be involved in the design, implementation, and reporting. Engaging stakeholders helps ensure their perspectives are understood and feedback is incorporated.
2. Clarify the scope, purpose, intended use, audience, and budget for evaluation.
3. Develop the questions to answer what you want to learn as a result of your work.
4. Select indicators. Indicators are meant to provide a clear means of measuring achievement, to help assess performance, or to reflect changes. They can be quantitative and/or qualitative. A process indicator is information that focuses on how a program is implemented.
5. Determine the data collection methods. Examples of methods are: document reviews, questionnaires, surveys, and interviews.
6. Analyze and synthesize the information you obtain. Review the information obtained to see if there are patterns or trends that emerge from the process.
7. Interpret these findings, provide feedback, and make recommendations. The process of analyzing data and understanding findings should provide you with recommendations about how to strengthen your work, as well as any mid-term adjustments you may need to make.
8. Communicate your findings and insights to stakeholders and decide how to use the results to strengthen your organization's efforts.
Monitoring and evaluation not only help organizations reflect on and understand past performance, but also serve as a guide for constructive changes during the period of implementation.

Why have a detailed toolkit on monitoring and evaluation? If you don't care about how well you are doing or about what impact you are having, why bother to do it at all? Monitoring and evaluation enable you to assess the quality and impact of your work against your action plans and your strategic plan. In order for monitoring and evaluation to be really valuable, you do need to have planned well. Planning is dealt with in detail in other toolkits on this website.

Who should use this toolkit? This toolkit should be useful to anyone working in an organisation or project who is concerned about the efficiency, effectiveness and impact of the work of the project or organisation.

When will this toolkit be useful? This toolkit will be useful when:
_ You are setting up systems for data collection during the planning phases of a project or organisation;
_ You want to analyse data collected through the monitoring process;
_ You are concerned about how efficiently and how effectively you are working;
_ You reach a stage in your project, or in the life of your organisation, when you think it would be useful to evaluate what impact the work is having;
_ Donors ask for an external evaluation of your organisation and/or work.

DESIGNING A MONITORING SYSTEM - CASE STUDY: What follows is a description of a process that a South African organisation called Puppets against AIDS went through in order to develop a monitoring system which would feed into monitoring and evaluation processes. The main work of the organisation is presenting workshopped plays and/or puppet shows related to lifeskills issues, especially those lifeskills to do with sexuality, at schools across the country.
The organisation works with a range of age groups, with different "products" (scripts) being appropriate at different levels. Puppets against AIDS wanted to develop a monitoring and evaluation system that provided useful information on the efficiency, effectiveness and impact of its operations. To this end, it wanted to develop a database that:
_ Provided all the base information the organisation needed about clients and services given;
_ Produced reports that enabled the organisation to inform itself and other stakeholders, including donors, partners and even schools, about the impact of the work, and what affected the impact of the work.
The organisation made a decision to go for a computerised monitoring system. Much of the day-to-day information required by the organisation was already on a computerised database (e.g. schools, regions, services provided and so on), but the monitoring system would require a substantial upgrading and the development of database software specific to the organisation's needs. The organisation also made the decision to develop a system initially for a pilot project, but with the intention of extending it to all the work over time. This pilot project would work with about 60 schools, using different scripts each year, over a period of three years. In order to raise the money needed for this process, Puppets against AIDS needed some kind of a brief for what was required so that it could be costed. At an initial workshop with staff, facilitated by consultants, the staff generated a list of indicators for efficiency, effectiveness and impact in relation to their work. These were the things staff wanted to know from the system about what they did, how they did it, and what difference it made. The terms were defined as follows:

Efficiency: Here what needed to be assessed was how quickly, how correctly, how cost-effectively and with what use of resources the services of the organisation were offered. Much of this information was already collected and was contained in reports which reflected planning against achievement. It needed to be made "computer friendly".

Effectiveness: Here what needed to be assessed was getting results in terms of the strategy and shorter-term impact. For example, were the puppet shows an effective means of communicating messages about sexuality? Again, this information was already being collected and just needed to be adapted to fit the computerised system.

Impact: Here what needed to be assessed was whether the strategy worked, in that it had an impact on changing behaviour in individuals (in this case the students) and that that change in behaviour impacted positively on ...

Source: Monitoring and Evaluation, by Janet Shapiro (email: [email protected] co.za).

Monitoring and evaluation is often seen as something that happens only when a donor insists on it; in fact, monitoring and evaluation are invaluable internal management tools. If you don't assess how well you are doing against targets and indicators, you may go on using resources to no useful end, without changing the situation you have identified as a problem at all. Monitoring and evaluation enable you to make that assessment.
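To make the case study a little more concrete, here is a rough, hypothetical sketch of the kind of record a computerised monitoring database like the one described might keep for each school visit. The field names and values are invented for illustration; they are not Puppets against AIDS' actual schema.

```python
# Rough, hypothetical sketch of one record in a computerised monitoring
# database of the kind described in the case study. All field names and
# values are invented for illustration; this is not the organisation's
# actual schema.
visit_record = {
    "school": "Example Secondary School",   # client
    "region": "Example Region",             # where the service was delivered
    "script": "Script A (Year 1)",          # which "product" was used
    "date": "2018-03-12",
    "learners_reached": 240,                # output, feeds effectiveness reporting
    "cost": 5500,                           # input, feeds efficiency reporting
    "pre_survey_completed": True,           # baseline for impact measurement
    "post_survey_completed": True,          # follow-up for impact measurement
}

# One efficiency measure that can be drawn straight from such a record:
print(f"Cost per learner reached: {visit_record['cost'] / visit_record['learners_reached']:.2f}")
```

Records like these, accumulated across the roughly 60 pilot schools over three years, are what would let the organisation report on efficiency, effectiveness and impact as defined above.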
