Commit

update dog vision v2
mrdbourke committed Apr 12, 2024
1 parent 7c9d62d commit 921451b
Showing 1 changed file with 14 additions and 10 deletions.
24 changes: 14 additions & 10 deletions section-4-unstructured-data-projects/end-to-end-dog-vision-v2.ipynb
@@ -2851,7 +2851,7 @@
"id": "WYwlNwfjdMEp"
},
"source": [
"## TK - 4. Creating training and test data split directories\n",
"## 4. Creating training and test data split directories\n",
"\n",
"After exploring the data, one of the next best things you can do is create experimental data splits.\n",
"\n",
@@ -3294,7 +3294,7 @@
"id": "QUYK0GNLS7on"
},
"source": [
"### TK - Making a 10% training dataset split\n",
"### Making a 10% training dataset split\n",
"\n",
"We've already split the data into training and test sets, so why might we want to make another split?\n",
"\n",
@@ -3966,7 +3966,7 @@
"id": "TUlSbEnIh-PO"
},
"source": [
"## TK - 5. Turning datasets into TensorFlow Dataset(s)\n",
"## 5. Turning datasets into TensorFlow Dataset(s)\n",
"\n",
"Alright, we've spent a bunch of time getting our dog images into different folders.\n",
"\n",
@@ -4298,7 +4298,7 @@
"id": "pclcbSv3VPv2"
},
"source": [
"### TK - Visualizing images from our TensorFlow Dataset\n",
"### Visualizing images from our TensorFlow Dataset\n",
"\n",
"Let's follow the data explorer's motto once again and *visualize, visualize, visualize!*\n",
"\n",
@@ -4399,7 +4399,7 @@
"id": "07e-LqpXwu5K"
},
"source": [
"### TK - Getting labels from our TensorFlow Dataset\n",
"### Getting labels from our TensorFlow Dataset\n",
"\n",
"Since our data is now in `tf.data.Dataset` format, there are a couple of important attributes we can pull from it if necessary.\n",
"\n",
@@ -4508,7 +4508,7 @@
"id": "vTE3j1OKnJ6k"
},
"source": [
"### TK - Configuring our datasets for performance\n",
"### Configuring our datasets for performance\n",
"\n",
"There's one last step we're going to do before we build our first TensorFlow model.\n",
"\n",
@@ -4587,7 +4587,7 @@
"id": "teWeV33en5Cq"
},
"source": [
"## TK - 6. Creating a neural network with TensorFlow\n",
"## 6. Creating a neural network with TensorFlow\n",
"\n",
"We've spent lots of time preparing the data.\n",
"\n",
@@ -4605,9 +4605,11 @@
"\n",
"A neural network often follows the structure of:\n",
"\n",
"Input layer -> Middle layer(s) -> Output layer\n",
"Input layer -> Middle layer(s) -> Output layer.\n",
"\n",
"TK - image of neural network with example\n",
"<img src=\"https://github.com/mrdbourke/zero-to-mastery-ml/blob/master/images/unstructured-data-anatomy-of-a-neural-network.png?raw=true\" width=750 alt=\"This image illustrates a simple neural network diagram with one input layer, one hidden layer, and one output layer. The input layer has 2 neurons, the hidden layer has 3 neurons, and the output layer has 1 neuron. The diagram shows connections between neurons across layers. There are annotations indicating that the neural network can be customized for almost any kind of input and output, and although the number and type of hidden layers are infinitely customizable, there is often a practical upper limit. The output layer represents learned data or prediction probabilities. The overall theme conveys the flexibility and adaptability of neural network architectures.\"/>\n",
"\n",
"*General anatomy of a neural network. Neural networks are almost infinitely customisable. The main premise is that data goes in one end, gets manipulated by many small functions in an attempt to learn patterns/weights which represent the data to produce useful outputs. Note that \"patterns\" is an arbitrary term, you’ll often hear \"embedding\", \"weights\", \"feature representation\", \"representation\" all referring to similar things.*\n",
"\n",
"Where the input layer takes in the data, the middle layer(s) perform calculations on the data and (hopefully) learn patterns (also called weights/biases) to represent the data and the output layer performs a final transformation on the learned patterns to make them usable in human applications.\n",
"\n",
@@ -4640,6 +4642,8 @@
"source": [
"### TK - The magic of transfer learning\n",
"\n",
"UPTOHERE\n",
"\n",
"**Transfer learning** is the process of getting an existing working model and adjusting it to your own problem.\n",
"\n",
"This works particularly well for neural networks.\n",
@@ -4664,7 +4668,7 @@
"* [`Hugging Face Models Hub`](https://huggingface.co/models) - A large collection of pretrained models on a wide range on tasks, from computer vision to natural language processing to audio processing. \n",
"* [`Kaggle Models`](https://www.kaggle.com/models) - A huge collection of different pretrained models for many different tasks.\n",
"\n",
"TK - image of where to find pretrained models\n",
"TK - image/table of where to find pretrained models\n",
"\n",
"> **Note:** For most new machine learning problems, if you're looking to get good results quickly, you should generally look for a pretrained model similar to your problem and use transfer learning to adapt it to your own domain.\n",
"\n",
