The Clearinghouse on Assessment and Evaluation (Ericae.net) is a comprehensive online resource dedicated to advancing knowledge in educational assessment, evaluation, and research methodology. Whether you are a student, educator, researcher, or policymaker, Ericae provides the tools, insights, and guidance needed to understand how learning is measured—and how those measurements can be used to improve outcomes.
At its core, Ericae is built to bridge the gap between theory and practice. You’ll find foundational content that explains essential concepts like validity, reliability, and different types of assessment, alongside more advanced material covering psychometrics, data analysis, and research design. For those actively working in education, the site also offers practical strategies for classroom assessment, program evaluation, and data-driven decision-making.
Ericae goes beyond traditional content by serving as a true resource hub. Visitors can explore curated reading paths through the Assessment Library, access peer-reviewed research from the Practical Assessment, Research & Evaluation (PARE) journal, and utilize tools like the Test Locator and ERIC database guides to find high-quality academic resources. Downloadable templates, rubrics, and evaluation tools are also available to support real-world application.
As education continues to evolve, Ericae remains focused on the future—covering topics like digital assessment, artificial intelligence, equity in measurement, and responsible data use. The goal is not just to inform, but to empower users to make better decisions, design better assessments, and contribute to more effective and equitable educational systems.
Whether you’re just getting started or looking to deepen your expertise, Ericae is your trusted guide to understanding and improving educational assessment and evaluation.

Designing Questions for Diverse Learners
Designing questions for diverse learners helps educators create fairer assessments, reveal true understanding, and support every student.

Writing Clear and Unambiguous Test Questions
Learn how writing clear and unambiguous test questions improves exam fairness, reduces confusion, and helps assessments measure what matters.

Common Pitfalls in Item Writing (and How to Fix Them)
Avoid common item-writing mistakes that weaken assessment quality. Learn practical fixes to build clearer, stronger items that measure accurately.

How to Review and Revise Test Questions Effectively
Learn how to review and revise test questions so assessments measure real learning, reduce weak items, and produce fairer, more accurate results.

Item Writing Guidelines for Standardized Tests
Learn item writing guidelines for standardized tests that improve validity, fairness, and score clarity so your assessments produce stronger results.

Using Real-World Contexts in Assessment Questions
Use real-world contexts in assessment questions to boost validity, engagement, and practical insight—so results better reflect real decisions.

Writing Questions That Assess Higher-Order Thinking
Learn how to write questions that assess higher-order thinking and create stronger assessments that truly measure analysis and problem solving.

Avoiding Bias in Test Item Writing
Avoiding bias in test item writing protects score validity and fairness, helping you create trusted assessments that measure what matters.

What Makes a Good Test Item? Key Principles
Learn what makes a good test item with clear, fair, aligned principles that help you write better assessments and measure learning with confidence.

Designing Performance-Based Assessment Tasks
Design performance-based assessment tasks that align to standards, reveal real learning, and deliver fair, useful results for better teaching.

Short Answer vs. Essay Questions: When to Use Each
Learn when to use short answer vs. essay questions to assess the right skills, score more fairly, and improve the quality of your tests.

How to Write High-Quality Essay Questions
Learn how to write high-quality essay questions that test analysis and reasoning, helping you design better assessments that reveal real understanding.

Constructed Response vs. Selected Response Items
Constructed response vs. selected response items explained simply—compare validity, scoring, alignment, and test-taker impact to choose better assessments.

Best Practices for Writing Distractors in MCQs
Learn best practices for writing distractors in MCQs to create fairer questions, reduce guessing, and measure real understanding more accurately.

Common Flaws in Multiple-Choice Questions (and Fixes)
Spot common flaws in multiple-choice questions and fix them fast with practical tips to write clearer, fairer assessments that better measure learning.
