diff --git a/section-4-unstructured-data-projects/end-to-end-dog-vision-v2.ipynb b/section-4-unstructured-data-projects/end-to-end-dog-vision-v2.ipynb
index b68c4892e..177d21df6 100644
--- a/section-4-unstructured-data-projects/end-to-end-dog-vision-v2.ipynb
+++ b/section-4-unstructured-data-projects/end-to-end-dog-vision-v2.ipynb
@@ -2851,7 +2851,7 @@
 "id": "WYwlNwfjdMEp"
 },
 "source": [
- "## TK - 4. Creating training and test data split directories\n",
+ "## 4. Creating training and test data split directories\n",
 "\n",
 "After exploring the data, one of the next best things you can do is create experimental data splits.\n",
 "\n",
@@ -3294,7 +3294,7 @@
 "id": "QUYK0GNLS7on"
 },
 "source": [
- "### TK - Making a 10% training dataset split\n",
+ "### Making a 10% training dataset split\n",
 "\n",
 "We've already split the data into training and test sets, so why might we want to make another split?\n",
 "\n",
@@ -3966,7 +3966,7 @@
 "id": "TUlSbEnIh-PO"
 },
 "source": [
- "## TK - 5. Turning datasets into TensorFlow Dataset(s)\n",
+ "## 5. Turning datasets into TensorFlow Dataset(s)\n",
 "\n",
 "Alright, we've spent a bunch of time getting our dog images into different folders.\n",
 "\n",
@@ -4298,7 +4298,7 @@
 "id": "pclcbSv3VPv2"
 },
 "source": [
- "### TK - Visualizing images from our TensorFlow Dataset\n",
+ "### Visualizing images from our TensorFlow Dataset\n",
 "\n",
 "Let's follow the data explorer's motto once again and *visualize, visualize, visualize!*\n",
 "\n",
@@ -4399,7 +4399,7 @@
 "id": "07e-LqpXwu5K"
 },
 "source": [
- "### TK - Getting labels from our TensorFlow Dataset\n",
+ "### Getting labels from our TensorFlow Dataset\n",
 "\n",
 "Since our data is now in `tf.data.Dataset` format, there are a couple of important attributes we can pull from it if necessary.\n",
 "\n",
@@ -4508,7 +4508,7 @@
 "id": "vTE3j1OKnJ6k"
 },
 "source": [
- "### TK - Configuring our datasets for performance\n",
+ "### Configuring our datasets for performance\n",
 "\n",
 "There's one last step we're going to do before we build our first TensorFlow model.\n",
 "\n",
@@ -4587,7 +4587,7 @@
 "id": "teWeV33en5Cq"
 },
 "source": [
- "## TK - 6. Creating a neural network with TensorFlow\n",
+ "## 6. Creating a neural network with TensorFlow\n",
 "\n",
 "We've spent lots of time preparing the data.\n",
 "\n",
@@ -4605,9 +4605,11 @@
 "\n",
 "A neural network often follows the structure of:\n",
 "\n",
- "Input layer -> Middle layer(s) -> Output layer\n",
+ "Input layer -> Middle layer(s) -> Output layer.\n",
 "\n",
- "TK - image of neural network with example\n",
+ "\n",
+ "\n",
+ "*General anatomy of a neural network. Neural networks are almost infinitely customisable. The main premise is that data goes in one end, gets manipulated by many small functions in an attempt to learn patterns/weights which represent the data to produce useful outputs. Note that \"patterns\" is a broad term; you’ll often hear \"embedding\", \"weights\", \"feature representation\" and \"representation\" all referring to similar things.*\n",
 "\n",
 "Where the input layer takes in the data, the middle layer(s) perform calculations on the data and (hopefully) learn patterns (also called weights/biases) to represent the data and the output layer performs a final transformation on the learned patterns to make them usable in human applications.\n",
@@ -4640,6 +4642,6 @@
 "source": [
- "### TK - The magic of transfer learning\n",
+ "### The magic of transfer learning\n",
 "\n",
 "**Transfer learning** is the process of getting an existing working model and adjusting it to your own problem.\n",
 "\n",
 "This works particularly well for neural networks.\n",
 "\n",
@@ -4664,7 +4666,7 @@
- "* [`Hugging Face Models Hub`](https://huggingface.co/models) - A large collection of pretrained models on a wide range on tasks, from computer vision to natural language processing to audio processing. \n",
+ "* [`Hugging Face Models Hub`](https://huggingface.co/models) - A large collection of pretrained models on a wide range of tasks, from computer vision to natural language processing to audio processing. \n",
 "* [`Kaggle Models`](https://www.kaggle.com/models) - A huge collection of different pretrained models for many different tasks.\n",
- "TK - image of where to find pretrained models\n",
+ "TK - image/table of where to find pretrained models\n",
 "\n",
 "> **Note:** For most new machine learning problems, if you're looking to get good results quickly, you should generally look for a pretrained model similar to your problem and use transfer learning to adapt it to your own domain.\n",
 "\n",