Compliance Uncomplicated Episode 9: Empowering a Security-First Culture With Polygon
In our latest episode of Compliance Uncomplicated, Drata’s SDR Manager, Kayla Cytron-Thaler, and I had the pleasure of hosting Meryll Dindin, the co-founder and CTO of Polygon—a psychology practice providing remote diagnostics for learning differences.
Meryll's diverse, global background shaped the entrepreneurial journey that led him to build Polygon. Having worked extensively at the intersection of artificial intelligence (AI) and healthcare, he gained valuable experience that informs his approach to entrepreneurship, operational excellence, security and compliance, and his views on AI and cybersecurity.
In this episode recap, we'll highlight the key takeaways from this discussion and give you a taste of Meryll's insightful journey in entrepreneurship, neuroscience, compliance, and AI.
The Origin and Mission of Polygon
The company's origin and mission come from a heartfelt and deeply personal place. Polygon, originally known as Thrive Education, was founded by Meryll and his co-founder and CEO, Jack, while they were earning their master's degrees at the University of California, Berkeley. After university, Jack was diagnosed with dyslexia.
As Meryll pointed out, it was difficult to come to terms with “realiz[ing] that the later you are diagnosed, all of the wrongdoings have already happened."
And we’re not talking about a small portion of the population here, either. Learning differences affect 20% of the population, and they include dyslexia, dysgraphia, dyscalculia, ADHD, and autism spectrum disorder, among others.
Being deeply immersed in neuroscience and dementia studies at the time, Meryll noticed the interconnections between his work and the need for more accessible evaluations for learning differences.
“Our goal at the time was really to unite on this notion of creating a product that would be accessible to everyone at the earliest age possible, to evaluate for learning differences. Polygon's mission is really to enable every person with a learning difference to reach their full potential."
Redefining the Terms
Meryll's vision for Polygon extends beyond providing a product for evaluation. It also involves challenging the one-size-fits-all approach to learning in the education system and the job market. The narrative shift from the term "learning disabilities" to "learning differences" was important for Meryll and his team.
And fun fact: “Polygon” isn’t just a random choice of name. Polygon’s name nods to the importance of acknowledging and accommodating individual learning differences rather than forcing everyone to fit into a universal mold.
“The academic system and the job market has created the path for the average person. What they think the average person is, we call the ‘perfect circle.’ Except, in reality, everyone is kind of their own shape. They can look like a triangle or a square… and everyone is trying to bring that shape within the circle mold."
Polygon’s Commitment to Compliance
Meryll shared the company's ongoing commitment to complying with the regulations that govern their industry, highlighting the importance of ensuring data privacy and security for their clients, especially given the sensitive nature of their work.
“We make promises to our clients that we're going to take care of the sensitive data. It's very important for us to build [the technology] with compliance inside of it."
Meryll also detailed their journey to becoming COPPA and HIPAA compliant, emphasizing that building their technology from the start with the end goal of compliance helped ease the process.
“It was kind of a rollercoaster to get there but given we really started the process of building our technology with that end goal in mind, it was not as strong of a problem. Whereas other entrepreneurs I have met in the past have had to pivot their own technology to make it a possibility."
Process and Team Training
A crucial part of Polygon's compliance process is the training of their team. Ensuring proper implementation, establishing standardized processes, and building relationships with third-party providers are among the key steps they’ve taken.
"We ensure privacy and security of our client's data, proper implementation, processes, relationship with third[-party] providers… the big thing that we've had to put in place is the training of the team."
Additionally, they’ve implemented measures to ensure that issues can be appropriately escalated and that actions are documented. Meryll said, "[The team knows] how to escalate issues, and they know that things need to be documented in most cases."
The Importance of Trust in Mental Healthtech
In the mental health space, trust is paramount. Meryll emphasized the need to build and maintain this trust, necessitating a proactive approach and celebration of compliance milestones.
"In the world of mental health—especially with a few hiccups with other startups in the past—you want to build this trust and make sure that doesn’t happen again, and you want to be proactive. Internal celebration and external celebration of those compliance milestones is important to us."
He also pointed out the catastrophic cost of data leaks, not only in terms of financial loss but also the potential destruction of client trust and personal reputation. He said, "Leak of data is super expensive. One is sufficient to lose all the trust with all your clients. I would absolutely not be proud of being the CTO of a company where data leaks."
Meryll views compliance as an ecosystem—one that helps build trust and promotes business growth. Plus, maintaining a good compliance posture is also a point of personal pride, Meryll added.
Making Security the DNA of Your Company
In the discussion, Meryll outlined several key strategies for fostering a security-first culture in a company. The most important, he noted, was the commitment from the leadership.
“The founders need to be well-versed in ecosystem security, understand the basis of vocabulary, or can get surrounded by advisors well-versed in that space."
He suggests that third-party tools, such as Drata, can be highly beneficial in helping companies build a robust security framework. However, it's crucial that leadership leads the charge, followed by management, in order to embed a security-focused mindset into the company's culture. Here are some additional tips from Meryll on fostering a security-first culture:
1. Build in Security Practices From the Beginning
“It needs to be part of the culture so that everyone understands it. It then becomes processes as you build a company and start hiring the people—it's part of the employee handbook, it’s part of the onboarding. They get used to it from day zero, and ultimately it comes back to getting this clear vision: The importance of security understood by everyone."
Meryll’s pro tip is to start from day one (onboarding every new employee with two-factor authentication, for example), since that seeds the idea of security early and makes long-term adoption easier. It's easier to teach proper practices from the beginning than it is to correct bad habits later.
2. Involve the Company, and Celebrate Compliance Milestones Together
An essential aspect of the security-first culture is involving every team in the company. When a new security process or policy is implemented, everyone should be included (I mean, they have to get trained after all). Similarly, celebrating compliance milestones should be a company-wide event, as this fosters pride and understanding of why compliance is so important.
Meryll pointed out that a successful compliance program is something the whole company should be proud of. "They went through the audit, they’re part of the success. Because they did the training, right? They did the policy right, they did the processes right, they understood the concept and the scope of it."
3. For the Love of All Things Compliant, Use Strong Passwords
“Just the basic, best practice, strong password, please,” Meryll emphasized. “It's crazy, this world."
AI's Transformative Role in Cybersecurity
Continuing the conversation, we explored the intersection of AI and cybersecurity with Meryll. Given the rapid pace of advancement, the integration of AI into cybersecurity has naturally emerged as a topic of significant importance.
"AI has tremendously improved cybersecurity, allowing automated parsing of logs and database transactions for anomaly detection.” This automation has replaced tedious manual work, increasing efficiency and effectiveness in identifying potential threats.
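To make the anomaly-detection idea concrete, here's a deliberately tiny sketch (our illustration, not Polygon's pipeline): flag any per-minute event count that sits more than a few standard deviations from the mean. Production systems use far richer models over logs and database transactions, but the underlying principle is the same.

```python
from statistics import mean, stdev


def flag_anomalies(counts: list[int], threshold: float = 2.5) -> list[int]:
    """Return indices of values whose z-score exceeds `threshold`.

    A crude stand-in for automated log analysis: compute how many standard
    deviations each observation sits from the mean, and flag the outliers.
    """
    mu = mean(counts)
    sigma = stdev(counts)
    if sigma == 0:
        return []  # perfectly flat traffic: nothing to flag
    return [i for i, c in enumerate(counts) if abs(c - mu) / sigma > threshold]


# Requests per minute with one suspicious burst at index 6:
events = [12, 14, 13, 11, 15, 12, 180, 13, 14, 12]
print(flag_anomalies(events))  # → [6]
```

The point of automating this is scale: a human can eyeball ten numbers, but not millions of log lines per hour, which is exactly the tedious work Meryll describes AI taking over.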
However, AI isn't without its drawbacks. Meryll voiced concerns about AI's potential to empower cybercriminals. With advanced AI tools, tasks previously thought to be difficult are now within reach for ill-intentioned actors as well, which increases cybersecurity risks.
"Password cracking—once a complicated task, anticipated to be solved by quantum computing—is now a reality due to AI," he pointed out. (*Cough* All the more important to use strong passwords!)
The Benefits of AI in Security Compliance
Meryll broke down the current benefits he’s seen with the increasing popularity of AI, particularly its scalability and ability to encapsulate the knowledge of numerous experts.
Some additional benefits include database anomaly detection, a huge increase in accessibility (for example, using AI to build tools you’d previously need experts for), and the emergence of AI as the "new white hat hackers" (referring to AI systems that can penetration test technologies, identify weak spots, and suggest improvements).
“Even when you have the vocabulary, you may not have the technical expertise to implement all the safeguards for your technology. But then you could have AI telling you what is lacking and fix it. And it's always better when you have someone pinpointing the issues than just you trying to figure out what the issues are without knowing. Basically you don't know what you don’t know.”
Ultimately, AI is still in its early stages, and predominantly owned by large corporations that typically have stringent compliance regulations. However, Meryll stressed the role of open-source communities in shaping the future of AI.
"The thing with open source communities is that you have a conversation not incentivized by money, but incentivized by what's the right thing to do.” This open-source, communal mindset could help steer the direction of AI development toward safer and more beneficial use cases.
Listen to the Episode
Learn more about Polygon’s cutting-edge work on their website and stay in the loop with them on LinkedIn, Twitter, and Medium. If you're interested in delving deeper into topics surrounding learning differences, Polygon's blog is a must-read.