Add LayerNorm support for Vivado #1110
base: main
Changes from 60 commits
@@ -21,6 +21,7 @@
     GarNet,
     GarNetStack,
     Layer,
+    LayerNormalization,
     Pooling1D,
     Pooling2D,
     SeparableConv1D,
@@ -558,6 +559,21 @@ def init_softmax(self, layer):
             len(layer.get_input_variable().shape) == 1
         ), 'Softmax with io_parallel strategy cannot be used on multidimensional tensors.'

+    @layer_optimizer(LayerNormalization)
+    def init_layernormalization(self, layer):
+        if 'table_t' not in layer.attributes:
+            layer.set_attr(
+                'table_t', NamedType(name=layer.name + '_table_t', precision=FixedPrecisionType(width=16, integer=6))
+            )
+        if 'table_size' not in layer.attributes:
+            layer.set_attr('table_size', 4096)  # table size
+        if 'table_range' not in layer.attributes:
+            layer.set_attr('table_range', 1.0)  # table range
+        if 'mean_t' not in layer.attributes:
+            layer.set_attr(
+                'mean_t', NamedType(name=layer.name + '_mean_t', precision=FixedPrecisionType(width=19, integer=6))
+            )
+
     @layer_optimizer(Embedding)
     def init_embed(self, layer):
         if layer.attributes['n_in'] is None:

Review comment (on the `table_size` line): These attributes should be set as default in … Also, is 4096 necessary for this implementation to work? All other tables are 1024.

Reply: Unfortunately, according to my tests, 4096 is necessary to achieve absolute differences of < 0.05 in accuracy.
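The two fixed-point types chosen in this hunk trade range against resolution: `width=16, integer=6` leaves 10 fractional bits for table entries, while `width=19, integer=6` leaves 13 fractional bits for the mean. A minimal sketch of that trade-off follows; the `quantize` helper is hypothetical and only mimics `ap_fixed<width, integer>` round-to-nearest with saturation, it is not hls4ml code:

```python
def quantize(value, width, integer):
    # Hypothetical model of ap_fixed<width, integer>: `width` total bits,
    # `integer` bits (including sign) left of the binary point.
    frac_bits = width - integer
    step = 2.0 ** -frac_bits            # smallest representable increment
    lo = -(2.0 ** (integer - 1))        # most negative representable value
    hi = 2.0 ** (integer - 1) - step    # most positive representable value
    q = round(value / step) * step      # round onto the fixed-point grid
    return max(lo, min(hi, q))          # saturate instead of wrapping

# table_t (width=16, integer=6): 10 fractional bits, step = 2**-10
print(quantize(0.1234567, 16, 6))   # -> 0.123046875
# mean_t (width=19, integer=6): 13 fractional bits, step = 2**-13
print(quantize(0.1234567, 19, 6))   # -> 0.1234130859375
# Both share the same ~[-32, 32) range because integer=6 in each case.
print(quantize(100.0, 16, 6))       # -> 31.9990234375 (saturated)
```

Widening `mean_t` to 19 bits keeps the same integer range while adding three fractional bits, which matters because the mean accumulates many small contributions.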
rianbrooksflynn marked this conversation as resolved.
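The exchange about `table_size` can be made concrete. The sketch below is not the implementation under review; it builds a hypothetical lookup table for 1/sqrt(x) over (0, table_range], the kind of table a layer-norm normalizer might use, purely to show why quadrupling the entry count from 1024 to 4096 shrinks the worst-case lookup error:

```python
import math

def invsqrt_table_error(table_size, table_range=1.0, probes=10000):
    # Precompute 1/sqrt at the center of each table bin, as a
    # table-based HLS implementation might.
    step = table_range / table_size
    table = [1.0 / math.sqrt((i + 0.5) * step) for i in range(table_size)]
    worst = 0.0
    for k in range(1, probes):
        x = k * table_range / probes
        idx = min(int(x / step), table_size - 1)  # bin holding x
        worst = max(worst, abs(table[idx] - 1.0 / math.sqrt(x)))
    return worst

# More entries -> narrower bins -> smaller worst-case error, especially
# near x = 0 where 1/sqrt(x) changes fastest.
print(invsqrt_table_error(1024))
print(invsqrt_table_error(4096))
```

This toy model exaggerates absolute error near zero, but the direction of the comparison matches the reviewer's observation that 1024 entries were not enough to keep accuracy differences below 0.05.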