Researchers Have Created an AI That Is Naturally Curious

  • Researchers have successfully given an AI a curiosity implant, which motivated it to explore a virtual environment.
  • This could be the bridge between AI and real-world applications.

    Researchers at the University of California (UC), Berkeley, have produced an artificial intelligence (AI) that is naturally curious.

  • While the AI that was not equipped with the curiosity ‘upgrade’ banged into walls repeatedly, the curious AI explored its environment in order to learn more.
  • This is a useful and effective strategy for teaching AI to complete specific tasks, as shown by AlphaGo beating the world's number-one Go player, but it is less useful when you want a machine to be autonomous and operate outside of direct commands.
  • This is a crucial step toward integrating AI into the real world and having it solve real-world problems because, as Agrawal says, “rewards in the real world are very sparse.”

Continue reading “Researchers Have Created an AI That Is Naturally Curious”

A Simple XGBoost Tutorial Using the Iris Dataset

  • It is important to install Python using Anaconda (in Anaconda’s directory), so that pip installs other libraries there as well.

    Now, a very important step: install the xgboost Python package dependencies beforehand.

  • I install these from experience:
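
    For example, something like the following, assuming a Linux shell with Anaconda on the PATH (the exact dependency list varies, but numpy, scipy and scikit-learn are the usual ones; the which check simply confirms that pip is Anaconda’s):

      which pip                              # should point inside the Anaconda directory
      pip install numpy scipy scikit-learn   # common dependencies for the tutorial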

    I upgrade my Python virtual environment so there is no trouble with Python versions:
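
    One way to do that, assuming conda manages the environment (the exact commands depend on how the environment was created):

      conda update conda
      conda update python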

    And finally I can install xgboost with pip (fingers crossed):
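
    The install itself is typically just:

      pip install xgboost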

    This command installs the latest xgboost version, but if you want to use a previous one, just specify it with:
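
    For example (the version number below is only a placeholder; substitute whichever release you need):

      pip install xgboost==0.90   # replace 0.90 with the version you want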

    Now test that everything has gone well by typing python in the terminal and trying to import xgboost:
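
    In the terminal this looks roughly like:

      $ python
      >>> import xgboost as xgb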

    If you see no errors – perfect.

  • First you load the dataset from sklearn, where X will be the data and y the class labels:
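
    A minimal version of that step:

      from sklearn import datasets

      iris = datasets.load_iris()
      X = iris.data      # feature matrix, shape (150, 4)
      y = iris.target    # class labels 0, 1, 2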

    Then you split the data into train and test sets with an 80-20 split:
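
    Using scikit-learn’s train_test_split (the random_state is only there to make the split reproducible):

      from sklearn.model_selection import train_test_split

      X_train, X_test, y_train, y_test = train_test_split(
          X, y, test_size=0.2, random_state=42)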

    Next you need to create the XGBoost-specific DMatrix data format from the numpy arrays.

  • XGBoost can work with numpy arrays directly, load data from svmlight files, and handle other formats as well.
  • Here is how to work with numpy arrays:
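
    Something along these lines:

      import xgboost as xgb

      dtrain = xgb.DMatrix(X_train, label=y_train)
      dtest = xgb.DMatrix(X_test, label=y_test)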

    If you want to use svmlight for less memory consumption, first dump the numpy array into svmlight format and then just pass the filename to DMatrix:
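
    A sketch of that, using scikit-learn’s dump_svmlight_file (the file names here are arbitrary):

      from sklearn.datasets import dump_svmlight_file
      import xgboost as xgb

      dump_svmlight_file(X_train, y_train, 'dtrain.svm', zero_based=True)
      dump_svmlight_file(X_test, y_test, 'dtest.svm', zero_based=True)

      # recent xgboost releases may expect the URI form 'dtrain.svm?format=libsvm'
      dtrain_svm = xgb.DMatrix('dtrain.svm')
      dtest_svm = xgb.DMatrix('dtest.svm')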

    Now for XGBoost to work you need to set the parameters:
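
    For the multiclass Iris problem the parameters could look like this (the values below are reasonable starting points rather than the only possible choice):

      param = {
          'max_depth': 3,                 # maximum depth of each tree
          'eta': 0.3,                     # learning rate
          'objective': 'multi:softprob',  # multiclass objective, returns class probabilities
          'num_class': 3,                 # Iris has three classes
      }
      num_round = 20                      # number of boosting rounds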

    Different datasets perform better with different parameters.
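
    From there, training and multiclass prediction (the part this overview summarises) can be sketched roughly as follows; the accuracy check at the end is just one convenient way to see how the model does:

      import numpy as np
      from sklearn.metrics import accuracy_score

      bst = xgb.train(param, dtrain, num_round)   # train the booster
      probs = bst.predict(dtest)                  # shape (n_test, 3) with multi:softprob
      preds = np.argmax(probs, axis=1)            # most likely class for each row

      print(accuracy_score(y_test, preds))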


This is an overview of the XGBoost machine learning algorithm, which is fast and shows good results. This example uses multiclass prediction with the Iris dataset from Scikit-learn.

Continue reading “A Simple XGBoost Tutorial Using the Iris Dataset”
