A summary of simple applications of various classification algorithms in Python using the sklearn library

  • 2021-07-09 08:42:35
  • OfStack

This article demonstrates simple applications of various classification algorithms in Python using the sklearn library, shared here for your reference as follows:

KNN


from sklearn.neighbors import KNeighborsClassifier
def KNN(X, y, XX):  # X, y are the training data and labels; XX is the test data
  model = KNeighborsClassifier(n_neighbors=10)  # the default is n_neighbors=5
  model.fit(X, y)
  predicted = model.predict(XX)
  return predicted

SVM


from sklearn.svm import SVC
def SVM(X, y, XX):
  model = SVC(C=5.0)  # the penalty parameter is an uppercase C; the default is C=1.0
  model.fit(X, y)
  predicted = model.predict(XX)
  return predicted

SVM Classifier using cross validation


def svm_cross_validation(train_x, train_y):
  from sklearn.model_selection import GridSearchCV  # sklearn.grid_search was removed in newer versions
  from sklearn.svm import SVC
  model = SVC(kernel='rbf', probability=True)
  param_grid = {'C': [1e-3, 1e-2, 1e-1, 1, 10, 100, 1000], 'gamma': [0.001, 0.0001]}
  grid_search = GridSearchCV(model, param_grid, n_jobs=1, verbose=1)
  grid_search.fit(train_x, train_y)
  best_parameters = grid_search.best_estimator_.get_params()
  for para, val in list(best_parameters.items()):
    print(para, val)
  # Refit an SVC using the best C and gamma found by the grid search
  model = SVC(kernel='rbf', C=best_parameters['C'], gamma=best_parameters['gamma'], probability=True)
  model.fit(train_x, train_y)
  return model
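
Unlike the other wrappers in this article, this function returns a fitted model rather than predictions. A minimal usage sketch (assuming train_x, train_y and test_x arrays have already been prepared elsewhere) might look like this:

model = svm_cross_validation(train_x, train_y)
predicted = model.predict(test_x)
probabilities = model.predict_proba(test_x)  # available because probability=True was set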

LR


from sklearn.linear_model import LogisticRegression
def LR(X, y, XX):
  model = LogisticRegression()
  model.fit(X, y)
  predicted = model.predict(XX)
  return predicted

Decision Tree (CART)


from sklearn.tree import DecisionTreeClassifier
def CART(X, y, XX):
  model = DecisionTreeClassifier()
  model.fit(X, y)
  predicted = model.predict(XX)
  return predicted

Random forest


from sklearn.ensemble import RandomForestClassifier
def RF(X, y, XX):
  model = RandomForestClassifier()
  model.fit(X, y)
  predicted = model.predict(XX)
  return predicted

GBDT (Gradient Boosting Decision Tree)


from sklearn.ensemble import GradientBoostingClassifier
def GBDT(X, y, XX):
  model = GradientBoostingClassifier()
  model.fit(X, y)
  predicted = model.predict(XX)
  return predicted

Naive Bayes: sklearn provides three variants, one based on the Gaussian distribution, one on the multinomial distribution, and one on the Bernoulli distribution.


from sklearn.naive_bayes import GaussianNB
from sklearn.naive_bayes import MultinomialNB
from sklearn.naive_bayes import BernoulliNB
def GNB(X, y, XX):  # Gaussian Naive Bayes, suited to continuous features
  model = GaussianNB()
  model.fit(X, y)
  predicted = model.predict(XX)
  return predicted
def MNB(X, y, XX):  # Multinomial Naive Bayes, suited to count features
  model = MultinomialNB()
  model.fit(X, y)
  predicted = model.predict(XX)
  return predicted
def BNB(X, y, XX):  # Bernoulli Naive Bayes, suited to binary features
  model = BernoulliNB()
  model.fit(X, y)
  predicted = model.predict(XX)
  return predicted
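
As a quick way to try the wrappers above, the sketch below runs several of them on sklearn's built-in iris dataset with a train/test split and prints the accuracy of each. The dataset choice, the split ratio, and the random_state are illustrative assumptions, not part of the original examples:

from sklearn.datasets import load_iris
from sklearn.model_selection import train_test_split
from sklearn.metrics import accuracy_score

# Load a small benchmark dataset and hold out 30% of it for testing.
X, y = load_iris(return_X_y=True)
X_train, X_test, y_train, y_test = train_test_split(X, y, test_size=0.3, random_state=0)

# Each wrapper takes (training data, training labels, test data) and returns predictions.
for name, clf in [('KNN', KNN), ('SVM', SVM), ('LR', LR), ('CART', CART),
         ('RF', RF), ('GBDT', GBDT), ('GaussianNB', GNB)]:
  predicted = clf(X_train, y_train, X_test)
  print(name, accuracy_score(y_test, predicted))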


I hope this article has been helpful for your Python programming.
