Question on None function in a machine learning algorithm
def evaluate_algorithm(dataset, algorithm, n_folds, *args):
	# Split the dataset into n_folds folds (cross_validation_split is
	# defined elsewhere in the same tutorial).
	folds = cross_validation_split(dataset, n_folds)
	scores = list()
	for fold in folds:
		# Train on every fold except the one currently held out.
		train_set = list(folds)
		train_set.remove(fold)
		train_set = sum(train_set, [])  # flatten the list of folds into one list of rows
		# Build the test set from copies of the held-out fold's rows,
		# with the target value (last column) blanked out.
		test_set = list()
		for row in fold:
			row_copy = list(row)
			test_set.append(row_copy)
			row_copy[-1] = None
		predicted = algorithm(train_set, test_set, *args)
		# The original fold still holds the true targets for scoring.
		actual = [row[-1] for row in fold]
		rmse = rmse_metric(actual, predicted)
		scores.append(rmse)
	return scores
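
For context, here is a minimal sketch of how the function might be called. The helper implementations and the mean_baseline algorithm below are illustrative stand-ins, not the tutorial's actual code:

from math import sqrt
from random import shuffle

# Illustrative stand-ins for the tutorial's helpers:
def cross_validation_split(dataset, n_folds):
	rows = list(dataset)
	shuffle(rows)
	fold_size = len(rows) // n_folds
	return [rows[i * fold_size:(i + 1) * fold_size] for i in range(n_folds)]

def rmse_metric(actual, predicted):
	return sqrt(sum((a - p) ** 2 for a, p in zip(actual, predicted)) / len(actual))

# Hypothetical baseline "algorithm": predict the mean training target
# for every test row, ignoring the None-filled targets in test_set.
def mean_baseline(train, test):
	mean = sum(row[-1] for row in train) / len(train)
	return [mean for _ in test]

dataset = [[1.0, 1.1], [2.0, 2.2], [3.0, 2.9], [4.0, 4.1], [5.0, 5.2], [6.0, 5.8]]
print(evaluate_algorithm(dataset, mean_baseline, 3))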
I was wondering if anyone could explain the inner for loop and what it does for the function. Specifically, why is row_copy[-1] = None necessary?
Close: the last element of each row is the target value, and setting row_copy[-1] = None blanks it out rather than removing it. The loop builds test_set from copies of the held-out fold's rows, so the algorithm only sees the input columns and cannot "cheat" by reading the answer. Because row_copy is a copy (list(row)), the original fold is untouched, which lets the true targets be collected into actual and compared against predicted to compute the RMSE.
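
A quick standalone demonstration of that copying behavior (the values here are made up):

fold = [[2.0, 1.9], [3.0, 3.1]]
test_set = list()
for row in fold:
	row_copy = list(row)   # shallow copy, so the original row is untouched
	test_set.append(row_copy)
	row_copy[-1] = None    # blank out the target before prediction

print(test_set)  # [[2.0, None], [3.0, None]] -- inputs only
print(fold)      # [[2.0, 1.9], [3.0, 3.1]] -- true targets kept for scoring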