I wrote code for multivariate polynomial regression using the polynomial-features transformation from sklearn. Is it possible to do multivariate logarithmic regression? Does sklearn have some kind of logarithmic transformation, like it has for polynomial features? How can I write multivariate logarithmic regression in Python?

This is my code for multivariate polynomial features:

import numpy as np
import pandas as pd
import math
import xlrd
from sklearn import linear_model
from sklearn.model_selection import train_test_split
from sklearn.preprocessing import PolynomialFeatures


#Reading data from excel

data = pd.read_excel("DataSet.xls").round(2)
data_size = data.shape[0]
#print("Number of data:",data_size,"
",data.head())

def polynomial_prediction_of_future_strength(input_data, cement, blast_fur_slug,fly_ash,
                                              water, superpl, coarse_aggr, fine_aggr, days):

    # prediction_accuracy() is a helper defined elsewhere; index 2 holds the
    # feature matrix and index 3 the measured strengths
    variables = prediction_accuracy(input_data)[2]
    results = prediction_accuracy(input_data)[3]
    n = results.shape[0]
    results = results.values.reshape(n,1) #reshaping the values so that variables and results have the same shape

    #transforming the data into polynomial function
    Poly_Regression = PolynomialFeatures(degree=2)
    poly_variables = Poly_Regression.fit_transform(variables)

    #accuracy of prediction(splitting the dataset on train and test)
    poly_var_train, poly_var_test, res_train, res_test = train_test_split(poly_variables, results, test_size = 0.3, random_state = 4)

    input_values = [cement, blast_fur_slug, fly_ash, water, superpl, coarse_aggr, fine_aggr, days]
    input_values = Poly_Regression.transform([input_values]) #transforming the data for prediction in polynomial function

    regression = linear_model.LinearRegression() #making the linear model
    model = regression.fit(poly_var_train, res_train) #fitting polynomial data to the model

    predicted_strength = regression.predict(input_values) #strength prediction
    predicted_strength = round(predicted_strength[0,0], 2)

    score = model.score(poly_var_test, res_test) #accuracy prediction
    score = round(score*100, 2)

    accuracy_info = "Accuracy of concrete class prediction: " + str(score) + " %\n"
    prediction_info = "Prediction of future concrete class after "+ str(days)+" days: "+ str(predicted_strength) 

    info = "
" + accuracy_info + prediction_info

    return info

#print(polynomial_prediction_of_future_strength(data, 214.9 , 53.8, 121.9, 155.6, 9.6, 1014.3, 780.6, 7))
See Question&Answers more detail:os


1 Answer

If you want to fit with the logarithms of your features, one option is a Box-Cox transform followed by OLS, which you can apply in sklearn using PowerTransformer: https://scikit-learn.org/stable/modules/generated/sklearn.preprocessing.PowerTransformer.html#sklearn.preprocessing.PowerTransformer
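
A minimal sketch of that suggestion, using randomly generated placeholder data in place of DataSet.xls (the real feature matrix and strength column would come from the question's prediction_accuracy helper): the features are run through PowerTransformer with the Box-Cox method, which requires strictly positive values (switch to method="yeo-johnson" if any feature can be zero or negative), and an ordinary LinearRegression is then fitted, mirroring the structure of the question's code.

import numpy as np
from sklearn.preprocessing import PowerTransformer
from sklearn.linear_model import LinearRegression
from sklearn.model_selection import train_test_split

# Placeholder data: 200 mixes with the 8 input variables
# (cement, blast furnace slag, fly ash, water, superplasticizer,
#  coarse aggregate, fine aggregate, days) and a strength column.
rng = np.random.default_rng(0)
X = rng.uniform(1, 100, size=(200, 8))   # strictly positive, as Box-Cox requires
y = rng.uniform(10, 80, size=(200, 1))

# Power transform of the features instead of PolynomialFeatures
pt = PowerTransformer(method="box-cox", standardize=True)
X_trans = pt.fit_transform(X)

X_train, X_test, y_train, y_test = train_test_split(
    X_trans, y, test_size=0.3, random_state=4)

model = LinearRegression().fit(X_train, y_train)
print("R^2 on test set:", round(model.score(X_test, y_test), 3))

# Predict for a new mix: reuse the already-fitted transformer
new_mix = np.array([[214.9, 53.8, 121.9, 155.6, 9.6, 1014.3, 780.6, 7]])
print("Predicted strength:", model.predict(pt.transform(new_mix))[0, 0])

If you only want a plain log of each feature rather than a fitted power transform, the same pipeline structure applies; the key point is to fit the transformer on the training data and reuse it for any new input, exactly as the question's code does with PolynomialFeatures.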

