A Ridiculously Basic Primer on Consent Culture

What is consent culture? At its simplest, consent culture is a culture in which asking for consent is normalized and encouraged. It applies to many parts of life, not just sexual consent: asking for consent should be treated as normal across a wide range of activities, including, but not limited to: