
Melbourne House Price Prediction Using a Fine-Tuning Technique


As we are living through an economically fragile period during the COVID-19 pandemic, an important question has emerged in the minds of many Brazilian investors, as well as ordinary people looking for a permanent home.

Based on that, this project presents a method for building a model that predicts house prices in Melbourne, Australia, using Python, TensorFlow 2 and KerasTuner.

With the techniques used in this work and a good dataset in hand, you can tackle almost any linear or non-linear regression problem with an MLP.

This repository contains all code used in this article on Medium.

Demo-Preview

As mentioned before, the key component of this project is KerasTuner. The function used to implement hyperparameter tuning is shown below.

import tensorflow as tf

def model_builder(hp):
  model = tf.keras.Sequential()

  # Create a model with 5 to 10 hidden layers
  for i in range(hp.Int('num_layers', 5, 10)):

    # Tune the number of units in each dense layer,
    # choosing the best value between 128 and 768
    hp_units = hp.Int('units_' + str(i), min_value=128, max_value=768, step=128)
    model.add(tf.keras.layers.Dense(units=hp_units, kernel_initializer='he_uniform'))
    # Add batch normalization before the activation function
    model.add(tf.keras.layers.BatchNormalization())
    model.add(tf.keras.layers.Activation(tf.nn.relu))
    # Add dropout for regularization
    model.add(tf.keras.layers.Dropout(0.2))

  # Output layer
  model.add(tf.keras.layers.Dense(1, activation=tf.nn.relu))

  # Tune the optimizer's learning rate by choosing the best value
  # among 0.01, 0.001 and 0.0001
  hp_learning_rate = hp.Choice('learning_rate', values=[1e-2, 1e-3, 1e-4])
  opt = tf.keras.optimizers.Adam(learning_rate=hp_learning_rate)

  # Configure the optimizer, loss and tracked metrics
  model.compile(optimizer=opt, loss='mape', metrics=['mae'])

  return model

Table of contents

- Installation
- Development
- Contribute
- License

Installation

(Back to top)

First of all, you need a Python environment with Jupyter Notebook installed to open the main file HousePricePrediction.ipynb and run this project. Alternatively, you can skip all of that and simply use a Google account (you probably already have one) with access to Google Colab, which I prefer.

Assuming you have one of the above, you just need to clone this repo:

git clone https://github.com/Lucastmarques/fine-tuning-house-price.git

And download all dependencies:

pip install -r requirements.txt

Development

(Back to top)

Since we used Jupyter Notebook to develop the entire project, each section of the code is documented and explained in markdown. Furthermore, every snippet of code is well commented and organized to be as didactic as possible.

Contribute

(Back to top)

Adding new features or fixing bugs

(Back to top)

Feel free to raise any type of issue or feature request in this project.

License

(Back to top)

The MIT License

Footer

(Back to top)

Leave a star on GitHub, give a clap on Medium and share this project if you found it helpful.

For more information, contact me at lucastmarques07@gmail.com.

And don't forget to visit my LinkedIn!