<?xml version="1.0" encoding="UTF-8"?>
<!-- generator="FeedCreator 1.8" -->
<?xml-stylesheet href="https://ease-crc.org/material/lib/exe/css.php?s=feed" type="text/css"?>
<rdf:RDF
    xmlns="http://purl.org/rss/1.0/"
    xmlns:rdf="http://www.w3.org/1999/02/22-rdf-syntax-ns#"
    xmlns:slash="http://purl.org/rss/1.0/modules/slash/"
    xmlns:dc="http://purl.org/dc/elements/1.1/">
    <channel rdf:about="https://ease-crc.org/material/feed.php">
        <title>EASE Material - ease:machinelearning</title>
        <description></description>
        <link>https://ease-crc.org/material/</link>
        <image rdf:resource="https://ease-crc.org/material/_media/wiki/dokuwiki.svg" />
        <dc:date>2026-04-04T10:14:49+00:00</dc:date>
        <items>
            <rdf:Seq>
                <rdf:li rdf:resource="https://ease-crc.org/material/ease/machinelearning/classifier_evaluation?rev=1592826559&amp;do=diff"/>
                <rdf:li rdf:resource="https://ease-crc.org/material/ease/machinelearning/classifier_training?rev=1592826364&amp;do=diff"/>
                <rdf:li rdf:resource="https://ease-crc.org/material/ease/machinelearning/data_preparation?rev=1592825999&amp;do=diff"/>
                <rdf:li rdf:resource="https://ease-crc.org/material/ease/machinelearning/decision_trees?rev=1592826161&amp;do=diff"/>
                <rdf:li rdf:resource="https://ease-crc.org/material/ease/machinelearning/machine_learning_theory?rev=1592820236&amp;do=diff"/>
                <rdf:li rdf:resource="https://ease-crc.org/material/ease/machinelearning/visualizing_the_data?rev=1592825866&amp;do=diff"/>
            </rdf:Seq>
        </items>
    </channel>
    <image rdf:about="https://ease-crc.org/material/_media/wiki/dokuwiki.svg">
        <title>EASE Material</title>
        <link>https://ease-crc.org/material/</link>
        <url>https://ease-crc.org/material/_media/wiki/dokuwiki.svg</url>
    </image>
    <item rdf:about="https://ease-crc.org/material/ease/machinelearning/classifier_evaluation?rev=1592826559&amp;do=diff">
        <dc:format>text/html</dc:format>
        <dc:date>2020-06-22T11:49:19+00:00</dc:date>
        <dc:creator>Anonymous (anonymous@undisclosed.example.com)</dc:creator>
        <title>classifier_evaluation</title>
        <link>https://ease-crc.org/material/ease/machinelearning/classifier_evaluation?rev=1592826559&amp;do=diff</link>
        <description>NEEMS Lecture: 6. Evaluate the Next Action Classifier

In the previous section we trained our decision tree model. In this final section, the model is evaluated.

Now we evaluate what the tree model is capable of. The purpose of this model is to predict which action is most likely to happen next, based on the previously performed action and its context. Simply execute the code blocks to see the outcome.</description>
    </item>
    <item rdf:about="https://ease-crc.org/material/ease/machinelearning/classifier_training?rev=1592826364&amp;do=diff">
        <dc:format>text/html</dc:format>
        <dc:date>2020-06-22T11:46:04+00:00</dc:date>
        <dc:creator>Anonymous (anonymous@undisclosed.example.com)</dc:creator>
        <title>classifier_training</title>
        <link>https://ease-crc.org/material/ease/machinelearning/classifier_training?rev=1592826364&amp;do=diff</link>
        <description>NEEMS Lecture: 5. Train the Next Action Classifier

In the previous section we talked about some machine learning theory. This section will cover the training of our decision tree model.

First, the prepared_narratives that were created earlier in this lecture are split into train_set and test_set. Then the train_set is split into features and labels, where the features contain the previous and parent actions for each action, and the labels contain the data about which action comes after…</description>
    </item>
    <item rdf:about="https://ease-crc.org/material/ease/machinelearning/data_preparation?rev=1592825999&amp;do=diff">
        <dc:format>text/html</dc:format>
        <dc:date>2020-06-22T11:39:59+00:00</dc:date>
        <dc:creator>Anonymous (anonymous@undisclosed.example.com)</dc:creator>
        <title>data_preparation</title>
        <link>https://ease-crc.org/material/ease/machinelearning/data_preparation?rev=1592825999&amp;do=diff</link>
        <description>NEEMS Lecture: 2. Data Preparation

In the previous section we started off by visualizing the NEEMS data as pie charts and tables. This section is about preparing the data for training, including filling empty data cells, transforming the data into a one-hot encoding, and reducing the tables to the relevant parts.</description>
    </item>
    <item rdf:about="https://ease-crc.org/material/ease/machinelearning/decision_trees?rev=1592826161&amp;do=diff">
        <dc:format>text/html</dc:format>
        <dc:date>2020-06-22T11:42:41+00:00</dc:date>
        <dc:creator>Anonymous (anonymous@undisclosed.example.com)</dc:creator>
        <title>decision_trees</title>
        <link>https://ease-crc.org/material/ease/machinelearning/decision_trees?rev=1592826161&amp;do=diff</link>
        <description>NEEMS Lecture: 3. Brief Introduction to Decision Trees

In the previous section we prepared the NEEMS data for training using a one-hot encoding. Here we will look at decision trees: how they are built and how to read them.

The statistical model of choice is a decision tree. Such models have the advantage of being visually inspectable and comparatively easy to understand. Based on the information about the</description>
    </item>
    <item rdf:about="https://ease-crc.org/material/ease/machinelearning/machine_learning_theory?rev=1592820236&amp;do=diff">
        <dc:format>text/html</dc:format>
        <dc:date>2020-06-22T10:03:56+00:00</dc:date>
        <dc:creator>Anonymous (anonymous@undisclosed.example.com)</dc:creator>
        <title>machine_learning_theory</title>
        <link>https://ease-crc.org/material/ease/machinelearning/machine_learning_theory?rev=1592820236&amp;do=diff</link>
        <description>NEEMS Lecture: 4. Additional Machine Learning Theory

Previously we talked about decision trees. This section briefly explains some of the relevant terms. Please follow the lecture for more information on this section.

Cross-validation is a model training technique in which the roles of training set and test set are interchanged several times, which helps guard against the model learning spurious influences of individual features.</description>
    </item>
    <item rdf:about="https://ease-crc.org/material/ease/machinelearning/visualizing_the_data?rev=1592825866&amp;do=diff">
        <dc:format>text/html</dc:format>
        <dc:date>2020-06-22T11:37:46+00:00</dc:date>
        <dc:creator>Anonymous (anonymous@undisclosed.example.com)</dc:creator>
        <title>visualizing_the_data</title>
        <link>https://ease-crc.org/material/ease/machinelearning/visualizing_the_data?rev=1592825866&amp;do=diff</link>
        <description>NEEMS Lecture: 1. Data Analysis

In the first section, we analyze a log from an actual robot performance. Every action the robot performed within a chain of tasks was recorded with the help of the knowledge base framework KnowRob. The output has already been prepared as a CSV file and can be loaded in this Python context. With the</description>
    </item>
</rdf:RDF>
