Many of the security and privacy mechanisms we build (permission prompts, security warnings, privacy policies) make one critical assumption: the end user is an adult with the agency to make their own decisions. Children, especially children in schools, operate in a different security and privacy context than the one assumed by the general-purpose online tools they use. Young students can't evaluate security risks or consent to data sharing, yet we give them the same security warnings and privacy controls that already confuse adults.
Alex Smolen (@alsmola) will join us to discuss his latest work, share his insights, and tell us a little about his background. Alex currently works at Clever and previously worked at Twitter and Foundstone.
When you design identity and access management (IAM) systems, consider psychology and sociology in addition to computer security. This talk describes the human-computer interaction problems IAM presents, along with three real-world patterns, each with an open-source implementation, for managing AWS IAM across an organization.
You might think application security and usability are a zero-sum game. Strong password policies, tight access controls, and cycle-burning cryptography improve system security but hamper the user experience. From a security advocate's perspective, it's important to minimize risk, even if it makes a system hard to use. But what if introducing strict security mechanisms actually increases risk? When do security and usability complement, rather than detract from, each other?
With daily code releases and a growing infrastructure, manually reviewing code changes and protecting against security regressions quickly becomes impractical. Even with security tools, whether commercial or open source, the difficult work of integrating them into the development and security cycles remains. We need an automated approach that pushes these tools as close as possible to the moment code is written, so potential vulnerabilities are caught before they ship.
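As a toy illustration of the idea, the sketch below flags dangerous calls in source text the way an automated check might in a pre-commit hook or CI step. This is an assumption-laden simplification, not any specific tool's approach; a real pipeline would run a dedicated scanner such as Bandit or Semgrep instead of a hand-rolled regex.

```python
import re

# Toy static check: flag eval()/exec() calls before the code ships.
# A real pipeline would invoke a dedicated scanner (e.g. Bandit,
# Semgrep) here; this only illustrates pushing the check to commit time.
DANGEROUS = re.compile(r"\b(eval|exec)\s*\(")

def scan(source: str) -> list[int]:
    """Return 1-based line numbers that contain a flagged call."""
    return [
        lineno
        for lineno, line in enumerate(source.splitlines(), start=1)
        if DANGEROUS.search(line)
    ]

snippet = "x = 1\ny = eval(user_input)\n"
print(scan(snippet))  # prints [2]
```

Wired into a pre-commit hook or a CI job, a check like this fails the build on a match, which is what moves the feedback loop from the security review back to the developer's desk.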