This is a basic collaborative-filtering recommender system. It uses a subset of the MovieLens dataset provided by GroupLens.
The recommender currently generates a list of the top 10 recommendations for any user in the dataset. Below are the steps on how it …
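The step list is truncated above; as a rough illustration of one common collaborative-filtering approach (user-based, with cosine similarity — an assumption, since the project's exact steps are elided), with made-up ratings standing in for the MovieLens subset:

```python
from math import sqrt

# Toy ratings matrix: user -> {movie_id: rating}. Hypothetical data,
# standing in for the MovieLens subset the project actually uses.
ratings = {
    "u1": {"m1": 5.0, "m2": 3.0, "m3": 4.0},
    "u2": {"m1": 4.0, "m2": 3.0, "m4": 5.0},
    "u3": {"m2": 2.0, "m3": 5.0, "m4": 4.0},
}

def cosine_similarity(a, b):
    # Similarity computed over the movies both users rated
    common = set(a) & set(b)
    if not common:
        return 0.0
    num = sum(a[m] * b[m] for m in common)
    den = sqrt(sum(a[m] ** 2 for m in common)) * sqrt(sum(b[m] ** 2 for m in common))
    return num / den

def top_n_recommendations(user, n=10):
    # Score each unseen movie by a similarity-weighted average of
    # the ratings other users gave it
    scores, weights = {}, {}
    for other, their in ratings.items():
        if other == user:
            continue
        sim = cosine_similarity(ratings[user], their)
        for movie, rating in their.items():
            if movie not in ratings[user]:
                scores[movie] = scores.get(movie, 0.0) + sim * rating
                weights[movie] = weights.get(movie, 0.0) + sim
    ranked = sorted(
        ((scores[m] / weights[m], m) for m in scores if weights[m] > 0),
        reverse=True,
    )
    return [m for _, m in ranked[:n]]

print(top_n_recommendations("u1"))
```

With this toy matrix, `u1` has only one unseen movie, so the top-10 list contains just `m4`.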
```python
from numpy import exp, array, random, dot

# Define neural network class
class NeuralNetwork():
    def __init__(self):
        # Seed random number generator
        random.seed(1)
        # Model single neuron with 3 inputs and 1 output
        # Assign random weights to a 3 x 1 matrix with values between -1 and 1
        self.synaptic_weights = 2 …
```
```python
from numpy import *

def compute_error_for_line_given_points(b, m, points):
    # Initialize error at 0
    totalError = 0
    # Loop through all points
    for i in range(0, len(points)):
        # Get x value
        x = points[i, 0]
        # Get y value
        y = points[i, 1]
        # Get squared difference and add to total error
        totalError += (y - (m …
```
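The snippet above is cut off. Assuming the standard sum-of-squared-residuals for the line y = m·x + b, averaged over the number of points (a common formulation, not confirmed by the truncated source), a self-contained version would look roughly like:

```python
# Self-contained sketch of the same error computation; the data below
# are made up for illustration.
def compute_error_for_line_given_points(b, m, points):
    total_error = 0.0
    for x, y in points:
        # Squared vertical distance from the point to the line
        total_error += (y - (m * x + b)) ** 2
    return total_error / float(len(points))

points = [(1.0, 2.0), (2.0, 4.0), (3.0, 6.0)]
print(compute_error_for_line_given_points(0.0, 2.0, points))  # perfect fit -> 0.0
```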
The purpose of this exercise is to use deep neural networks to classify traffic signs. Specifically, we train a model to classify traffic signs from the German Traffic Sign Dataset.
Data
The pickled data is a dictionary with 4 key/value pairs:
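The four keys are not listed here; a loading sketch using the conventional `features`/`labels` layout (the key names are an assumption, not taken from the dataset — inspect the real dictionary with `data.keys()` first):

```python
import pickle

# 'features' and 'labels' are assumed key names for illustration only
def load_split(path):
    with open(path, "rb") as f:
        data = pickle.load(f)
    return data["features"], data["labels"]

# Round-trip demo with a tiny stand-in file
with open("demo.p", "wb") as f:
    pickle.dump({"features": [[0, 0], [1, 1]], "labels": [0, 1]}, f)

X, y = load_split("demo.p")
print(len(X), len(y))
```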
In this project, we use the following tools to identify lane lines on the road:
- Color selection
- Region of interest selection
- Grayscaling
- Gaussian smoothing
- Canny Edge Detection
- Hough Transform line detection
We develop a pipeline on a series of individual images, and later apply …
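As a rough sketch of the first two stages (grayscale conversion and a region-of-interest mask) in plain NumPy — the real pipeline would use OpenCV equivalents such as `cv2.cvtColor` and `cv2.fillPoly`, and a trapezoidal rather than rectangular mask:

```python
import numpy as np

def grayscale(rgb):
    # Luminance-weighted grayscale conversion (same weights cv2 uses for RGB)
    return 0.299 * rgb[..., 0] + 0.587 * rgb[..., 1] + 0.114 * rgb[..., 2]

def region_of_interest(img):
    # Keep only the lower half of the frame, a simplified stand-in
    # for the trapezoidal mask typically used for lane detection
    mask = np.zeros_like(img)
    h = img.shape[0]
    mask[h // 2:, :] = 1
    return img * mask

frame = np.ones((4, 4, 3))          # dummy all-white frame
roi = region_of_interest(grayscale(frame))
print(roi)
```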
This visualization describes the percentage of passengers who survived the Titanic disaster. The circles represent subsets of passengers categorized by sex, passenger class, and point of embarkation. The size of each circle corresponds to the percent value; the higher the proportion of …
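The percentage driving each circle's size can be computed as a grouped survival rate. A minimal sketch with hypothetical passenger records (not the real Titanic data behind the visualization):

```python
from collections import defaultdict

# Hypothetical (sex, class, embarkation, survived) records for illustration
passengers = [
    ("female", 1, "C", 1),
    ("female", 1, "C", 1),
    ("male", 3, "S", 0),
    ("male", 3, "S", 1),
]

# Percent survived per (sex, class, embarkation) subset -> circle size
totals, survived = defaultdict(int), defaultdict(int)
for sex, pclass, port, alive in passengers:
    key = (sex, pclass, port)
    totals[key] += 1
    survived[key] += alive

pct = {k: 100.0 * survived[k] / totals[k] for k in totals}
print(pct)
```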
In this experiment, Udacity tested a change in which students who clicked “Start free trial” on the home page were asked about their availability to commit to the course, as can be seen here. If the student indicated that they had fewer than 5 hours per week …
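A standard way to analyze such an experiment is a two-proportion z-test comparing control and experiment groups. A sketch using the pooled standard error, with hypothetical counts rather than the experiment's actual numbers:

```python
from math import sqrt

def two_proportion_z(x_c, n_c, x_t, n_t):
    # Pooled two-proportion z statistic: difference in conversion rates
    # divided by the pooled standard error
    p_pool = (x_c + x_t) / (n_c + n_t)
    se = sqrt(p_pool * (1 - p_pool) * (1 / n_c + 1 / n_t))
    return ((x_t / n_t) - (x_c / n_c)) / se

# Hypothetical enrollment counts (x = conversions, n = group size)
z = two_proportion_z(x_c=200, n_c=1000, x_t=150, n_t=1000)
print(round(z, 3))
```

A |z| above roughly 1.96 would indicate a statistically significant difference at the 5% level.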
The goal of this project is to identify Enron employees who were involved in the fraud that came to light in 2001, using a machine learning algorithm trained on public Enron financial and email data.
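One feature commonly engineered for this task is the fraction of a person's sent messages addressed to known persons of interest. A hedged sketch — the field names mirror the public Enron dataset's conventions, but the record below is hypothetical:

```python
# Fraction of sent messages addressed to persons of interest (POIs);
# missing counts are treated as zero, as NaN-like entries often are here
def poi_message_fraction(record):
    sent = record.get("from_messages") or 0
    to_poi = record.get("from_this_person_to_poi") or 0
    return to_poi / sent if sent else 0.0

record = {"from_messages": 40, "from_this_person_to_poi": 10}
print(poi_message_fraction(record))
```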
Map Area: Waterloo, Ontario, Canada
Problems Encountered in the Map:
After initially downloading the Waterloo area and running it against the script below, I noticed the following problems with the data:
• Inconsistent methods of describing street direction (e.g. “N” or “North”)
• Inconsistent street types (e.g. St. instead …
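Both inconsistencies above can be normalized with a small mapping pass over each street name. A sketch with an illustrative mapping (not the project's full table):

```python
# Canonical forms for abbreviated directions and street types;
# these entries are illustrative, not the project's complete mapping
DIRECTIONS = {"N": "North", "S": "South", "E": "East", "W": "West"}
STREET_TYPES = {"St.": "Street", "St": "Street", "Ave": "Avenue", "Rd.": "Road"}

def clean_street_name(name):
    out = []
    for token in name.split():
        # Expand direction abbreviations, then street-type abbreviations
        token = DIRECTIONS.get(token, token)
        token = STREET_TYPES.get(token, token)
        out.append(token)
    return " ".join(out)

print(clean_street_name("King St. N"))  # -> "King Street North"
```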
```python
# Import csv library
import unicodecsv

# Define function to read and store data
def read_csv(filename):
    with open(filename, 'rb') as f:
        reader = unicodecsv.DictReader(f)
        return list(reader)

enrollments = read_csv('data/enrollments.csv')
daily_engagement = read_csv('data/daily_engagement.csv')
project_submissions = read_csv('data/project_submissions.csv')

# Print out first …
```
This project is the result of a Udacity assignment in which a deep learning model is trained to drive a vehicle autonomously. Udacity provides the simulation environment used to train and test the models. The video of the result can be viewed here. The numbers scrolling on the …
A few researchers set out to determine the optimal length of chopsticks for children and adults. They came up with a measure of how effectively a pair of chopsticks performed, called the "Food Pinching Performance." The "Food Pinching Performance" was determined by counting the number …
This script gathers ticker data on the tickers in the TickerList.csv file and calculates the following metrics:
- Simple moving averages
- RSI
- Exponential moving averages
- MACD
Then the tickers are filtered using the following user-defined parameters:
- MinPrice
- MaxPrice
- MinRSI
- MaxRSI
- MinVol
- MA1
- MA2
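The metric and filter definitions are not shown above. As a minimal sketch of two of the listed metrics — a simple moving average and Wilder's RSI over a list of closing prices — with the implementation being an assumption rather than the script's actual code:

```python
def sma(closes, window):
    # Simple moving average of the last `window` closes
    return sum(closes[-window:]) / window

def rsi(closes, period=14):
    # Wilder's RSI from successive close-to-close changes
    gains, losses = [], []
    for prev, cur in zip(closes, closes[1:]):
        change = cur - prev
        gains.append(max(change, 0.0))
        losses.append(max(-change, 0.0))
    avg_gain = sum(gains[-period:]) / period
    avg_loss = sum(losses[-period:]) / period
    if avg_loss == 0:
        return 100.0          # no losses in the window -> maximum RSI
    rs = avg_gain / avg_loss
    return 100.0 - 100.0 / (1.0 + rs)

closes = [float(p) for p in range(1, 20)]   # toy, monotonically rising prices
print(sma(closes, 5), rsi(closes))
```

A filter pass would then keep only tickers satisfying the user-defined bounds, e.g. `MinRSI <= rsi(closes) <= MaxRSI` and `MinPrice <= closes[-1] <= MaxPrice`.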
The Baseball Data was used to answer the following questions:
Are salaries higher in 2015 than in 1985?
Does the mean salary increase with the year?
Do player salaries have a higher correlation with total runs or home runs?
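The third question reduces to comparing two Pearson correlation coefficients. A sketch with hypothetical per-player numbers, not figures from the Baseball Data itself:

```python
from math import sqrt

def pearson(xs, ys):
    # Pearson correlation coefficient between two equal-length series
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    sx = sqrt(sum((x - mx) ** 2 for x in xs))
    sy = sqrt(sum((y - my) ** 2 for y in ys))
    return cov / (sx * sy)

# Hypothetical per-player values for illustration only
salaries   = [500, 750, 900, 1200, 2000]
total_runs = [40, 55, 60, 80, 110]
home_runs  = [5, 20, 8, 15, 30]

print(pearson(salaries, total_runs), pearson(salaries, home_runs))
```

Whichever coefficient is larger in magnitude answers the question for the real dataset.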