
Commit 09130e4

Update README.md
1 parent 4ed24bb commit 09130e4

File tree: 1 file changed (+3, -68)


README.md

Lines changed: 3 additions & 68 deletions
@@ -1,68 +1,3 @@
-## CrossyRoad
-The EXE is under the Executable folder
-
-## Documentation
-More documentation about ML-Agents is at https://github.com/Unity-Technologies/ml-agents/blob/release_19_docs/docs/Readme.md <br />
-Useful docs:
-- [API Docs/Python API Documentation](https://github.com/Unity-Technologies/ml-agents/blob/release_19_docs/docs/Python-API-Documentation.md)
-- [API Docs/How to use the Python API](https://github.com/Unity-Technologies/ml-agents/blob/release_19_docs/docs/Python-API.md)
-- [Python Tutorial with Google Colab/Using a UnityEnvironment](https://colab.research.google.com/github/Unity-Technologies/ml-agents/blob/release_19_docs/colab/Colab_UnityEnvironment_1_Run.ipynb)
-- [Python Tutorial with Google Colab/Q-Learning with a UnityEnvironment](https://colab.research.google.com/github/Unity-Technologies/ml-agents/blob/release_19_docs/colab/Colab_UnityEnvironment_2_Train.ipynb)
-
-## Installation
-1. Create an environment with **Python 3.6 or 3.7**
-2. Install PyTorch from https://pytorch.org/get-started/locally/
-3. Install mlagents with pip
-```
-python -m pip install mlagents==0.28.0
-```
-4. Install importlib-metadata
-```
-pip install importlib-metadata==4.4
-```
-More installation details at https://github.com/Unity-Technologies/ml-agents/blob/release_19_docs/docs/Installation.md
-
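A quick sanity check after step 4 (not part of the original steps): if the install succeeded, the `mlagents-learn` trainer CLI should be available on the PATH.
```
mlagents-learn --help
```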
-## Usage (Command Line)
-Run the ML-Agents default trainers (PPO/SAC) from an Anaconda command prompt in the folder containing the exe:
-```
-mlagents-learn <config path> --env=<exe name> --run-id=<run_name>
-```
-For example:
-```
-mlagents-learn config\player_config.yaml --env="CRML" --run-id=test
-```
-
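Not mentioned above, but `mlagents-learn` also accepts the standard `--resume` flag, which continues an interrupted run under the same run-id, e.g.:
```
mlagents-learn config\player_config.yaml --env="CRML" --run-id=test --resume
```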
-## Usage (Python)
-To load a Unity environment from a built binary file, put the script in the same directory
-as the environment (exe) and run:
-```python
-from mlagents_envs.environment import UnityEnvironment
-# This is a non-blocking call that only loads the environment.
-env = UnityEnvironment(file_name="CRML", seed=1, side_channels=[])
-# Start interacting with the environment.
-env.reset()
-behavior_names = env.behavior_specs.keys()
-...
-```
-More details at https://github.com/Unity-Technologies/ml-agents/blob/release_19_docs/docs/Python-API.md
-
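The snippet above stops after `env.reset()`. A minimal sketch of a full interaction loop, assuming the `CRML` binary from this project and a single registered behavior (random actions only; an illustration, not the project's training code):
```python
from mlagents_envs.environment import UnityEnvironment

env = UnityEnvironment(file_name="CRML", seed=1, side_channels=[])
env.reset()

# Assume exactly one behavior (the player agent) is registered.
behavior_name = list(env.behavior_specs.keys())[0]
spec = env.behavior_specs[behavior_name]

for episode in range(3):
    env.reset()
    done = False
    while not done:
        decision_steps, terminal_steps = env.get_steps(behavior_name)
        if len(terminal_steps) > 0:
            done = True  # the episode ended for the tracked agent(s)
            continue
        # Sample a random action for every agent that requested a decision.
        action = spec.action_spec.random_action(len(decision_steps))
        env.set_actions(behavior_name, action)
        env.step()

env.close()
```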
-## Action Space
-Continuous actions: 0 <br />
-Discrete action branches: 1 <br />
-- Branch size: 5 <br />
-0: No Movement / 1: Front / 2: Back / 3: Left / 4: Right
-
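A short sketch (not from the original README) of sending one such discrete action from Python, assuming a single agent and the branch mapping above (value 1 = Front):
```python
import numpy as np
from mlagents_envs.base_env import ActionTuple
from mlagents_envs.environment import UnityEnvironment

env = UnityEnvironment(file_name="CRML", seed=1, side_channels=[])
env.reset()
behavior_name = list(env.behavior_specs.keys())[0]

# Shape is (num_agents, num_branches): one agent, one branch, action index 1 = "Front".
action = ActionTuple(discrete=np.array([[1]], dtype=np.int32))
env.set_actions(behavior_name, action)
env.step()
```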
-## Observation Space
-Total size: 60 <br />
-30 features observed, stacked over 2 frames.
-- size 2: player coordinates (X, Z)
-- size 4: the type of each line relative to the player (previous, current, next two) <br />
-type 0: Grass, 1: Road, 2: Water
-- size 2 per obstacle: obstacle coordinates (X, Z) <br />
-3 obstacles observed per line, 6 features per line, 24 features in total
-
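A rough sketch (not from the original README) of unpacking one observation vector, assuming the features appear in the order listed above (player, line types, obstacles) and that the two stacked frames are simply concatenated; both orderings are assumptions, since the README does not spell them out:
```python
import numpy as np

def split_observation(obs: np.ndarray):
    """Split a 60-value stacked observation into its assumed parts."""
    assert obs.shape == (60,)
    frames = obs.reshape(2, 30)                 # assumed: 2 stacked frames of 30 features
    latest = frames[-1]                         # assumed: the last frame is the most recent
    player_xz = latest[0:2]                     # player coordinates (X, Z)
    line_types = latest[2:6]                    # previous, current, and next two lines
    obstacles = latest[6:30].reshape(4, 3, 2)   # 4 lines x 3 obstacles x (X, Z)
    return player_xz, line_types, obstacles
```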
-## Changelogs
-- v2.1: Reward is now added correctly when beating the high score
-- v2.0: Observation size changed to 60.<br/>
-Added player coordinates, line types, and obstacle coordinates to the observation
-- v1.0: Executable created. Observation space size = 3
+## CrossyRoad Unity
+Unity source code <br/>
+Find the Executable File [here](https://github.com/Introduction-to-Machine-Learning-Team4/Executable)

0 commit comments
