Commit 82a8c7a

update julia syntax highlighting in README (#183)
1 parent beb18bd commit 82a8c7a

File tree

1 file changed: +3 -3 lines changed


README.md

Lines changed: 3 additions & 3 deletions
@@ -113,7 +113,7 @@ This package does not intend to reinvent a fully usable reinforcement learning A
 
 As of this writing, all actions in all environments are discrete. And so, to keep things simple and consistent, they are represented by elements of `Base.OneTo(NUM_ACTIONS)` (basically integers going from 1 to NUM_ACTIONS). In order to know which action does what, you can call `GW.get_action_names(env)` to get a list of names which gives a better description. For example:
 
-```
+```julia
 julia> env = GW.SingleRoomUndirectedModule.SingleRoomUndirected();
 
 julia> GW.get_action_names(env)
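
The paragraph in the hunk above describes how integer actions map to names via `GW.get_action_names`. As a minimal illustrative sketch (not part of the commit), assuming only the constructor and `GW.get_action_names` shown in the excerpt:

```julia
# Illustrative sketch, not part of the diff: pairing integer actions with
# their names. Only the constructor and GW.get_action_names from the README
# excerpt above are assumed.
import GridWorlds as GW

env = GW.SingleRoomUndirectedModule.SingleRoomUndirected()
action_names = GW.get_action_names(env)

# Actions are plain integers from 1 to NUM_ACTIONS.
for action in Base.OneTo(length(action_names))
    println(action, " => ", action_names[action])
end
```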
@@ -130,7 +130,7 @@ This package does not intend to reinvent a fully usable reinforcement learning A
 
 Each environment contains a tile map, which is a `BitArray{3}` that encodes information about the presence or absence of objects in the grid world. It is of size `(num_objects, height, width)`. The second and third dimensions correspond to positions along the height and width of the tile map. The first dimension corresponds to the presence or absence of objects at a particular position using a multi-hot encoding along the first dimension. You can get the name and ordering of objects along the first dimension of the tile map by using the following method:
 
-```
+```julia
 julia> env = GW.SingleRoomUndirectedModule.SingleRoomUndirected();
 
 julia> GW.get_object_names(env)
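
The paragraph in this hunk explains the multi-hot layout of the tile map. The following sketch (not part of the commit) shows one way to read it; the `env.tile_map` field name is an assumption here, since the excerpt only states that each environment contains such an array:

```julia
# Illustrative sketch, not part of the diff: listing the objects present at
# one position of the tile map. The field name `env.tile_map` is assumed;
# the README excerpt only says the array has size (num_objects, height, width).
import GridWorlds as GW

env = GW.SingleRoomUndirectedModule.SingleRoomUndirected()
object_names = GW.get_object_names(env)
tile_map = env.tile_map  # assumed field; a BitArray{3}

row, column = 2, 2  # arbitrary position inside the grid
present = [object_names[i] for i in axes(tile_map, 1) if tile_map[i, row, column]]
println("objects at ($row, $column): ", present)
```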
@@ -155,7 +155,7 @@ Here is an example:
 
 In order to programmatically record the behavior of an agent during an episode, you can simply log the string representation of the environment at each step prefixed with a delimiter. You can also log other arbitrary information if you want, like the total reward so far, for example. You can then use the `GW.replay` function to replay the recording inside the terminal. The string representation of an environment can be obtained using `repr(MIME"text/plain"(), env)`. Here is an example:
 
-```
+```julia
 import GridWorlds as GW
 import ReinforcementLearningBase as RLBase
 
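
The recording example in the last hunk is cut off by the diff context. As a rough, hedged sketch of the logging idea the paragraph describes (not part of the commit), one could write each frame to a file prefixed with a delimiter; the `GW.act!` call, the delimiter string, and the file name below are assumptions, and the exact `GW.replay` invocation is not shown in this excerpt:

```julia
# Illustrative sketch, not part of the diff: logging one frame per step so the
# episode can later be replayed in the terminal with GW.replay. The stepping
# call GW.act!, the delimiter string, and the file name are assumptions.
import GridWorlds as GW

env = GW.SingleRoomUndirectedModule.SingleRoomUndirected()
delimiter = "FRAME_START_DELIMITER"

open("recording.txt", "w") do io
    for action in Base.OneTo(length(GW.get_action_names(env)))
        GW.act!(env, action)                       # assumed stepping function
        print(io, delimiter)                       # frame delimiter, as described above
        print(io, repr(MIME"text/plain"(), env))   # string representation of env
    end
end
```

After writing such a file, the recording could then be passed to `GW.replay`; its exact arguments are not shown in this excerpt.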