
To allow batchnorm layers with frozen running stats #472

@EmmyFang

Description


🚀 Feature

Could the privacy engine allow batchnorm layers whose running stats are frozen (i.e., all batchnorm layers kept in .eval() mode)?
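To make the request concrete, here is a minimal sketch of what "freezing the running stats" means in plain PyTorch; the `freeze_batchnorm_stats` helper is hypothetical and only illustrates the intended setup:

```python
import torch.nn as nn

def freeze_batchnorm_stats(model: nn.Module) -> nn.Module:
    """Keep every batchnorm layer's running_mean / running_var fixed.

    In eval mode a batchnorm layer normalizes with its stored running stats
    and does not update them, so the stats behave as constants. Note that a
    later call to model.train() flips these layers back to train mode, so
    this helper would have to be re-applied after every such call.
    """
    for module in model.modules():
        if isinstance(module, (nn.BatchNorm1d, nn.BatchNorm2d, nn.BatchNorm3d)):
            module.eval()  # use (and never update) running_mean / running_var
            # Optionally freeze the affine parameters as well:
            # module.weight.requires_grad_(False)
            # module.bias.requires_grad_(False)
    return model
```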

Motivation

When doing transfer learning from a pretrained model that contains batchnorm layers, it would be helpful to still be able to use the privacy engine by freezing the running stats and treating them as constants.

Pitch

It would be helpful if the privacy engine allowed modules such as batchnorm with frozen running stats, especially when we want to reuse pretrained models that contain batchnorm layers. Thank you in advance!
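For context, the workaround currently documented in the Opacus tutorials appears to be replacing batchnorm layers (typically with GroupNorm) via ModuleValidator, which discards the pretrained batchnorm statistics and is exactly what this request would like to avoid. A rough sketch, using a toy network as a stand-in for the pretrained model:

```python
import torch.nn as nn
from opacus.validators import ModuleValidator

# Toy network standing in for a pretrained model that contains batchnorm.
model = nn.Sequential(nn.Conv2d(3, 8, 3), nn.BatchNorm2d(8), nn.ReLU())

# validate() lists the modules Opacus cannot handle (the batchnorm layer here);
# fix() swaps them for supported alternatives such as GroupNorm, losing the
# pretrained batchnorm statistics in the process.
print(ModuleValidator.validate(model, strict=False))
model = ModuleValidator.fix(model)
assert ModuleValidator.validate(model, strict=False) == []
```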

Labels: enhancement (New feature or request)
