onepl_mml
def onepl_mml(dataset, alpha=None, options=None)
Estimates parameters in a 1PL IRT model.
Args
dataset- [items x participants] matrix of True/False values
alpha- [float] discrimination constraint
options- dictionary with updates to default options
Returns
discrimination- (float) estimate of test discrimination
difficulty- (1d array) estimates of item difficulties
Options
- distribution: callable
- quadrature_bounds: (float, float)
- quadrature_n: int
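Before integrating, `onepl_mml` collapses identical response columns so the marginal likelihood is evaluated once per unique response pattern and weighted by its multiplicity. A minimal sketch of that compression step, using a hypothetical 4-item by 6-participant dataset (the matrix values are invented for illustration):

```python
import numpy as np

# Hypothetical [items x participants] matrix of True/False values.
dataset = np.array([
    [True,  True,  False, True,  True,  False],
    [True,  False, False, True,  False, False],
    [False, True,  False, False, True,  False],
    [True,  True,  True,  True,  True,  False],
])

# Collapse duplicate columns: each unique column is one response
# pattern, and counts records how many participants produced it.
unique_sets, counts = np.unique(dataset, axis=1, return_counts=True)

print(unique_sets.shape)  # (4, 4): two pattern pairs were duplicates
print(counts)             # multiplicities, summing to 6 participants
```

In the source, `counts` reappears at the end of the cost function, where `-np.log(otpt).dot(counts)` weights each pattern's log-likelihood by how often it occurred.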
Expand source code
def onepl_mml(dataset, alpha=None, options=None):
    """ Estimates parameters in a 1PL IRT model.

    Args:
        dataset: [items x participants] matrix of True/False values
        alpha: [float] discrimination constraint
        options: dictionary with updates to default options

    Returns:
        discrimination: (float) estimate of test discrimination
        difficulty: (1d array) estimates of item difficulties

    Options:
        * distribution: callable
        * quadrature_bounds: (float, float)
        * quadrature_n: int
    """
    options = validate_estimation_options(options)
    quad_start, quad_stop = options['quadrature_bounds']
    quad_n = options['quadrature_n']

    # Difficulty estimation parameters
    n_items = dataset.shape[0]
    n_no, n_yes = get_true_false_counts(dataset)
    scalar = n_yes / (n_yes + n_no)

    # Collapse duplicate response patterns and weight them by count
    unique_sets, counts = np.unique(dataset, axis=1, return_counts=True)
    the_sign = convert_responses_to_kernel_sign(unique_sets)

    discrimination = np.ones((n_items,))
    difficulty = np.zeros((n_items,))

    # Quadrature locations
    theta = _get_quadrature_points(quad_n, quad_start, quad_stop)
    distribution = options['distribution'](theta)

    # Inline definition of cost function to minimize
    def min_func(estimate):
        discrimination[:] = estimate
        _mml_abstract(difficulty, scalar, discrimination,
                      theta, distribution, options)
        partial_int = _compute_partial_integral(theta, difficulty,
                                                discrimination, the_sign)
        # Add distribution
        partial_int *= distribution
        otpt = integrate.fixed_quad(
            lambda x: partial_int, quad_start, quad_stop, n=quad_n)[0]

        return -np.log(otpt).dot(counts)

    # Perform the minimization
    if alpha is None:  # OnePL method
        alpha = fminbound(min_func, 0.25, 10)
    else:  # Rasch method
        min_func(alpha)

    return alpha, difficulty
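The final branch distinguishes the two estimation modes: when `alpha` is None, the common discrimination is found by a bounded 1-D search (`fminbound` over [0.25, 10]); otherwise it is held fixed and only difficulties are updated (the Rasch case). A self-contained sketch of that bounded search, where `neg_log_like` is a toy stand-in for the real marginal negative log-likelihood and 1.7 is an arbitrary illustrative optimum:

```python
from scipy.optimize import fminbound

def neg_log_like(a, target=1.7):
    # Toy cost with a known minimum at `target`; in onepl_mml this role
    # is played by min_func, which integrates over the latent trait.
    return (a - target) ** 2

# Same bounds the source uses for the OnePL branch.
alpha_hat = fminbound(neg_log_like, 0.25, 10)
print(alpha_hat)  # ≈ 1.7
```

Because `min_func` mutates `discrimination` and `difficulty` in place, the difficulties returned by `onepl_mml` always correspond to the last discrimination value evaluated, in both branches.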
Last modified April 7, 2020: Updating documentation. (3da0254)