- District Admin account holder
- School Admin account holder from your school
- Users to whom the District Admin and School Admin account holders grant access
- Often these include:
- Assistant Superintendent
- Assistant Principal
- Those in a supervisory role in charge of evaluations
Why did I not get a report?
- Did not provide instruction in a PVAAS reported grade/subject/Keystone content area
- ELA and Math, Grades 4-8
- Science, Grades 4 and 8
- Keystone Algebra I, Literature, Biology
- Did not have at least 11 students in one of the PVAAS reported grades/subjects/Keystone content areas
- Did not have at least 6 full-time equivalent students in one of the PVAAS reported grades/subjects/Keystone content areas
Why can’t I see my students’ data?
- Access to student-level data is controlled locally by each LEA/district
- School Admin account holders or School Users with Account Management privileges are able to edit teachers’ accounts to grant or remove access to student-level data
- Manually, via the “Admin” link and then “Modify Access” on the teacher’s account
- Batch creation with Staff Template file in PIMS
- Contact the PVAAS Statewide Team (email@example.com) for assistance on how to grant this access to teachers
Can I still see the roster data I verified last spring?
- Yes, all roster data are available for reference
- Access via the Reports menu under Roster Verification
- Available for up to 3 years
Which students are included in my reporting?
- To be included in PVAAS teacher-specific reporting, a student MUST:
- Have a PSSA or Keystone score in the current school year
- Not be Proficient or Advanced on a previous, specified Keystone exam
- Be claimed at a minimum of 10% total instructional responsibility
- Not be a foreign exchange student
- Not be a first-year EL student
- Not have tested with the PASA (alternate assessment)
What does the Composite mean?
- A combined measure of the growth indices
- Includes all tested grades, subjects, and courses for which a teacher received a Value-Added report
- For up to three consecutive years
- One year of data – 1-year composite
- Two years of data – 2-year composite
- Three years of data – 3-year composite
- Consecutive means consecutive years of Value-Added reporting:
- In any assessed grade, subject, or Keystone content area
- The years do NOT need to be in the same grade, subject, or Keystone content area
What does the Standard Error mean on my report?
- A way of indicating how much evidence there is in the growth measure
- Smaller standard error --> More evidence
- Protection for teachers to account for things such as missing data, small numbers of students, etc.
- The Average Growth Index (AGI) accounts for the Standard Error
- AGI = Growth Measure / Standard Error
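The formula above can be sketched in code. A minimal Python sketch; the growth measures and standard errors below are hypothetical illustrations, not real PVAAS data:

```python
# Sketch of the Average Growth Index (AGI) calculation described above:
# AGI = Growth Measure / Standard Error.
# All numbers here are hypothetical examples, not real PVAAS values.

def average_growth_index(growth_measure: float, standard_error: float) -> float:
    """Return the AGI: the growth measure expressed in standard-error units."""
    return growth_measure / standard_error

# Two hypothetical teachers with the same growth measure but different
# amounts of evidence (a smaller standard error means more evidence):
agi_less_evidence = average_growth_index(2.0, 2.0)   # larger standard error
agi_more_evidence = average_growth_index(2.0, 0.8)   # smaller standard error

print(agi_less_evidence)  # 1.0
print(agi_more_evidence)  # 2.5
```

The same growth measure yields a larger AGI when the standard error is smaller, which is how the AGI "accounts for" the standard error.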
How did I do compared to other teachers in my school, district, and in the state?
- PVAAS teacher-specific reporting is confidential information. This information is NOT reported at the district or school level so that no single teacher would be identifiable.
- Teachers can see a statewide comparison to other teachers in the same grade and subject or Keystone content area by viewing the Distribution of Teachers table on their Value-Added reports.
What do the PVAAS colors really mean on my Value Added report?
- Green means there is evidence that the student group, on average, maintained their achievement.
- Light blue means there is moderate (some) evidence that the student group, on average, gained ground in achievement; yellow means there is moderate (some) evidence that the group, on average, fell behind.
- Dark blue means there is significant evidence that the student group, on average, gained ground in achievement; red means there is significant evidence that the group, on average, fell behind.
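The five colors can be pictured as bands on the Average Growth Index scale. A minimal Python sketch; the cutoff values of ±1 and ±2 are illustrative assumptions, since this FAQ does not state the actual PVAAS thresholds:

```python
# Sketch of how the five colors could map onto an Average Growth Index (AGI)
# scale. The +/-1 and +/-2 cutoffs below are hypothetical illustrations;
# the actual PVAAS thresholds are not given in this FAQ.

def growth_color(agi: float) -> str:
    """Map a hypothetical AGI value to the color categories described above."""
    if agi >= 2.0:
        return "dark blue"   # significant evidence the group gained ground
    if agi >= 1.0:
        return "light blue"  # moderate evidence the group gained ground
    if agi > -1.0:
        return "green"       # evidence the group maintained achievement
    if agi > -2.0:
        return "yellow"      # moderate evidence the group fell behind
    return "red"             # significant evidence the group fell behind

print(growth_color(0.3))   # green
print(growth_color(-2.4))  # red
```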
Am I at a disadvantage because I have a small number of students?
- No! Teachers are protected by the use of the Average Growth Index, which accounts for the value-added estimate and the standard error together.
- The correlation between teachers’ Average Growth Index and teachers’ actual number of students is 0.03685.
- The scatterplot, shown above, plots the number of students used in each teacher’s PVAAS Value-Added report against the teacher’s Average Growth Index. Each dot represents one teacher.
- The scatterplot demonstrates that teachers serving both small and large numbers of students can show both high and low growth, as measured by PVAAS.
How can my students show growth when they are already very high (or very low) achieving?
- PVAAS measures growth for a GROUP of students.
- High achievement can be defined in many ways – not just the % of students reaching proficiency or higher.
- Proficient and Advanced are RANGES of performance.
- PVAAS does NOT measure growth by a change in performance levels.
- PA’s state assessments have enough stretch to measure growth for both high-achieving and low-achieving students.
- The graph, shown below, plots the average entering achievement for the students served by an individual teacher in Pennsylvania against the teacher’s Average Growth Index. Each dot represents one teacher.
- Regardless of the achievement of the teacher’s group of students, there is essentially no correlation between student achievement and the Average Growth Index of the group.
- In Pennsylvania, we tend to assess over 750,000 students with the PSSA each year. In any given year, typically fewer than 1,000 of those students score at the highest point of the Advanced range on the PSSA – that’s less than half a percent statewide.
- Also, in Pennsylvania, we usually assess between 150,000 and 200,000 students each year with the Keystone exams. Typically only 10 to 75 of those students score at the highest point of the Advanced range on the Keystone exams – that’s less than a tenth of a percent of all students who test statewide.
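The percentages cited above can be checked with quick arithmetic, using the counts from the FAQ:

```python
# Quick check of the statewide percentages cited above.
# 1,000 of 750,000 PSSA testers at the highest Advanced score point:
pssa_top_pct = 1_000 / 750_000 * 100       # about 0.13% -- under half a percent

# 75 of 150,000 Keystone testers at the highest Advanced score point
# (using the upper count and lower enrollment, the larger of the cited ratios):
keystone_top_pct = 75 / 150_000 * 100      # 0.05% -- under a tenth of a percent

print(round(pssa_top_pct, 2))      # 0.13
print(round(keystone_top_pct, 2))  # 0.05
```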
If my students lost ground the year prior, do they have to make that up to be able to show growth with me this year?
- Growth, as measured by PVAAS, compares students’ entering achievement to their ending achievement – not what they grew (or didn’t grow) the year prior.
- Students serve as their own control; the student group is compared to themselves and NOT a prior group.
My students did well on our common assessments. Why am I not getting blue or green for growth in PVAAS?
- Common assessments are often used as formative or benchmark assessments to guide instruction. These assessments may not assess all of the eligible content, the same eligible content, or the same weighting of the eligible content.
- Achievement on any assessment is about performance at a specific point in time.
- Proficiency or higher is about achievement – not growth.
- PVAAS value-added reporting follows the progress of students over time, regardless of achievement level.
Why isn’t attendance taken into account?
- The district, school, and individual teachers each have a role in preventing and intervening with student attendance issues.
- Teacher-specific strategies include areas such as high expectations, engaging instruction, relationship building with students, parent communication, incentive programs, etc.
- Students can be dropped or un-enrolled from a subject/grade/course based on LEA policy.
- Is the concern chronic absenteeism?
- This is often related to concerns about high-poverty students or other socioeconomic factors that are related to low achievement.
- To the extent these factors are likely to be similar from year to year, using all available prior test scores enables each student to serve as their own control.
- The growth expectation will indirectly take these factors into account, since it is taking the students’ previous performance and entering achievement into account.
Why can’t I calculate my own value-added score?
- PVAAS statistical models are robust and sophisticated.
- Running statistical models at this level requires statistical software and computing power far beyond that of an “everyday” computer.
- Historical statewide data are used in the statistical models to accurately and appropriately capture the prior achievement of groups of students.
- Not available beyond the state level due to the confidentiality of student-level data
- Correlations are calculated between years and assessments and are reflected in the statistical models
- All data and statistical models are run simultaneously statewide to capture reliable and valid value-added measures at the district, school, and teacher levels.
Why should I trust the methodology or statistical models used?