Implement linear regression from scratch in Python. D = (X, y), where X is a feature matrix with three features per sample and y is a 1-D target vector.

Here's an implementation of linear regression from scratch, where X is a feature matrix with three features per sample and y is a 1-dimensional target vector:

import numpy as np

class LinearRegression:
    def __init__(self):
        self.weights = None
        
    def fit(self, X, y):
        # add a column of ones to X for the bias term
        X = np.c_[np.ones(X.shape[0]), X]
        # calculate weights using the normal equation (X^T X) w = X^T y;
        # the pseudo-inverse (pinv) is used instead of inv so that a
        # rank-deficient X, such as the example data below, does not
        # raise a LinAlgError
        self.weights = np.linalg.pinv(X.T @ X) @ X.T @ y
        
    def predict(self, X):
        # add a column of ones to X for the bias term
        X = np.c_[np.ones(X.shape[0]), X]
        return X.dot(self.weights)
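The normal equation requires inverting XᵀX, which fails when the columns of X are linearly dependent. As a small sketch (with made-up numbers), both np.linalg.pinv and np.linalg.lstsq return the minimum-norm least-squares solution even for such rank-deficient data:

```python
import numpy as np

# made-up data for illustration; the third feature is a linear
# function of the first (x3 = x1 - 0.5), so the design matrix with
# a bias column is rank-deficient and X^T X is singular
X = np.array([[1.0, 2.0, 0.5],
              [2.0, 1.0, 1.5],
              [3.0, 3.0, 2.5],
              [4.0, 2.0, 3.5]])
y = np.array([3.0, 5.0, 9.0, 11.0])

Xb = np.c_[np.ones(X.shape[0]), X]                 # prepend the bias column
w_pinv = np.linalg.pinv(Xb) @ y                    # pseudo-inverse solution
w_lstsq, *_ = np.linalg.lstsq(Xb, y, rcond=None)   # SVD-based least squares

print(np.allclose(w_pinv, w_lstsq))                # both give the same fit
```

Plain np.linalg.inv would raise a LinAlgError on this data, which is why the fit method above avoids it.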

To use the implementation, create an instance of the LinearRegression class and call the fit method to calculate the weights for the model, then use the predict method to make predictions for new data. Here's an example:

import numpy as np

# create the feature matrix X and the target vector y
X = np.array([[1, 2, 3], [2, 3, 4], [3, 4, 5], [4, 5, 6]])
y = np.array([1, 2, 3, 4])

# create an instance of the LinearRegression class
reg = LinearRegression()

# fit the model to the data using the fit method
reg.fit(X, y)

# use the predict method to make predictions for new data
predictions = reg.predict(X)

# print the predictions
print(predictions)

The output will be:

[1. 2. 3. 4.]

Because y here is exactly equal to the first column of X, the least-squares fit reproduces the training targets; on noisy data the predictions would only approximate y.
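As a quick follow-up sketch, the same least-squares weights can also score a sample outside the training set. The sample [5, 6, 7] below is made up for illustration, and the fit is done directly with NumPy's lstsq so the snippet stands alone:

```python
import numpy as np

# same training data as the example above
X = np.array([[1, 2, 3], [2, 3, 4], [3, 4, 5], [4, 5, 6]], dtype=float)
y = np.array([1.0, 2.0, 3.0, 4.0])

# fit by least squares on the bias-augmented matrix
Xb = np.c_[np.ones(len(X)), X]
w, *_ = np.linalg.lstsq(Xb, y, rcond=None)

# hypothetical unseen sample [5, 6, 7]; the leading 1 is the bias term
x_new = np.array([1.0, 5.0, 6.0, 7.0])
print(x_new @ w)
```

For this data the target happens to equal the first feature, and the new sample continues that pattern, so the prediction lands at approximately 5.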
