This week, we welcome Christian Geyer, founder and CEO of ACTFORE, for a deep dive into the world of automated breach response. From his early days in military intelligence to founding a disruptive forensics firm, Christian shares how his diverse experiences shaped a data‑driven approach to tackling one of corporate America’s thorniest problems: managing the chaos of a data breach. Along the way we hear about secret clearances, cabinet making, and the humble beginnings of a startup launched from a literal closet.
Christian’s journey is anything but ordinary. Recruited straight into an intelligence agency at nineteen, he cut his teeth on top‑secret work before supporting Navy research labs with data dashboards that informed mission‑critical funding decisions. When an injury sidelined him from field ops, he turned to carpentry, framing houses and crafting cabinets during the mid‑2000s flip boom. That hands‑on trade remains Christian’s reminder that some skills can’t be automated, even as he builds AI to do the heavy lifting elsewhere.
He carried that disruptive spirit into Crypsis Group, undercutting the incident response market with half‑price forensics and skyrocketing revenue from zero to $20 million in four years. COVID’s budget cuts then prompted a pivot: leveraging ActiveNav’s data‑discovery engine to automate breach notification. What began as a side project in a shared office closet evolved into ACTFORE, a company that in a single four‑and‑a‑half‑day engagement processed 3 million patient records across 2,000 endpoints, cementing its reputation for speed, accuracy, and onshore security.
At the heart of ACTFORE’s offering is an automated extraction platform powered by “infant AI” tailored to each client. Rather than shipping data overseas for human review, Christian’s team uses software instances, deployed globally or on‑premises, to scan, fingerprint, and parse structured and unstructured files. Their Trace tool brings the automation into one‑click point‑and‑extract workflows, slashing keystrokes and cutting review timelines by weeks without sacrificing human‑in‑the‑loop oversight where it matters most.
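The scan-and-parse step described above is, at its core, pattern recognition over text. As a rough illustration only (not ACTFORE’s actual code; the patterns and field names here are hypothetical and deliberately simplified), an automated PII sweep might look like:

```python
import re

# Simplified patterns for two common US PII elements. Real breach-response
# tooling uses far more robust detection (validation, context, OCR, etc.).
PII_PATTERNS = {
    "ssn": re.compile(r"\b\d{3}-\d{2}-\d{4}\b"),
    "date_of_birth": re.compile(
        r"\b(?:0[1-9]|1[0-2])/(?:0[1-9]|[12]\d|3[01])/(?:19|20)\d{2}\b"
    ),
}

def extract_pii(text: str) -> dict[str, list[str]]:
    """Scan unstructured text and collect candidate PII matches by type."""
    return {label: pattern.findall(text) for label, pattern in PII_PATTERNS.items()}

sample = "Patient DOB 04/17/1982, SSN 123-45-6789, seen 2021."
print(extract_pii(sample))
# {'ssn': ['123-45-6789'], 'date_of_birth': ['04/17/1982']}
```

Real pipelines layer OCR, validation, and human review on top of matching like this; the point is that once a pattern is known, extraction scales to millions of files without keystrokes.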
Looking ahead, Christian warns of a new insider threat: AI models trained on proprietary data that could be weaponized from within. While ACTFORE continues to focus on reactive breach response, Christian sees advisory and proactive scanning services on the horizon, particularly in regions with more robust data regulations. As breach frequency rises and AI proliferates, ACTFORE aims to stay ahead of the curve, turning its combination of automation, human expertise, and a closet‑born scrappiness into the next frontier of cybersecurity.
Listen on mobile platforms: Apple Podcasts | Spotify | YouTube
[Special thanks to Legal Technology Hub for sponsoring this episode.]
https://open.spotify.com/episode/6eHBRiw6oI08ixL7P42wEL?si=5W-7OphdSuiT-GMrM5nI4Q
Bluesky: @geeklawblog.com @marlgeb
Email: geekinreviewpodcast@gmail.com
Music: Jerry David DeCicca
Transcript
Marlene Gebauer (00:00)
Hi, I’m Marlene Gebauer from The Geek in Review, and I have Nikki Shaver here from Legal Technology Hub, and she’s got some very exciting news about the upcoming Legal Technology Hub conference, right? I don’t want to spoil it. I’m going to let you say.
Nikki Shaver (00:13)
Thank you. Thank
you. We are super excited to be able to announce Jae Um as the keynote speaker for our LTH flagship conference, the New York Innovation and Legal Tech Conference, which is being held this year on September 18th in New York.
Marlene Gebauer (00:20)
Nice!
Nikki Shaver (00:30)
⁓ Many of your viewers, Marlene, I’m sure will know that Jae is really one of the foremost experts in the world on law firm management and pricing strategy, having previously held the position of global chief strategy officer for Baker & McKenzie. She’s now the founder of Lumio, an organization providing insights and tools for managing partners, unlocking potential and enhancing law firm performance. One of the trends we at Legal Tech Hub have seen in the market this
year in particular, is a dedication by firms, a new dedication, I would say, to reviewing pricing strategies and rethinking the value of legal work. And obviously, that kind of a trend, that kind of a shift, that sort of an openness to looking at that key area of law firm management in new ways is really noteworthy and leads to all kinds of repercussions around the way we price and bill legal work, the way we look at
performance
internally. It really gives rise to a lot of complexity and true business model change. So Jae is going to set the scene at our conference for critical conversations across verticals on that most important topic of the day and the repercussions from it. And we’re just very excited to have her. Find out more and register for the event at legaltechnologyhub.com. If you look at our top menu, there’s a drop-down for
events, including LTH events. You can just click on that and register for the conference. We keep our prices low because our objective is really to have the important conversations of the day with everyone involved in legal and legal tech. And we really hope to see you there.
Marlene Gebauer (02:13)
Yeah, well, I’m excited for you and I am definitely looking forward to it.
Nikki Shaver (02:18)
We can’t wait to see you there, Marlene.
Marlene Gebauer (02:28)
Welcome to The Geek in Review, the podcast focused on innovative and creative ideas in the legal industry. I’m Marlene Gebauer.
Greg Lambert (02:35)
And I’m Greg Lambert, and this week we are joined by Christian Geyer, who is the founder and CEO of ACTFORE. And Christian, we’ve been talking probably since February, March, you know... I finally got you on here. So welcome to The Geek in Review.
Marlene Gebauer (02:45)
Yeah.
Finally making it happen, this is good.
Christian Geyer (02:53)
Appreciate it. Yes, it’s a few months in the works and I’m happy to be here.
Greg Lambert (02:57)
No pressure, no pressure.
Marlene Gebauer (02:58)
Yeah, right? Not
at all. So Christian, before we dive into talking about ACTFORE, I understand you have a very interesting background that might not be the traditional tech entrepreneur story. So tell us about your experience working for the US military, how you came to be part of the Crypsis Group, and tell us a little bit about your carpentry.
Christian Geyer (03:22)
Uh, interesting, hopefully it’s interesting for me. I don’t know, a long, lengthy road to becoming a CEO and founder. Starting out, I mean, this is back when I was 18, 19 years old, I was recruited right out of college. So you could consider me almost like a college dropout at one point, but I was recruited into the intelligence agency, moved over there, did a couple of years supporting the intelligence agency,
basically doing secret and TS type work. And from there, I transitioned from the agency over to supporting the Navy. So I worked with the Center for Naval Analyses, which is an FFRDC, as well as the Naval Research Laboratory, NRL. That’s when I actually went back to college, got my first bachelor’s degree, my second bachelor’s degree, and then eventually got my master’s.
It was there where I found almost my love and my passion for data. So I was working on pulling disparate forms of data from various platforms and then converting them into a dashboard for all of leadership. And we were basically managing and utilizing that platform at all levels of management throughout the Navy to then inform themselves about how they’re spending money, how they’re utilizing it, how best to allocate and fund projects.
And that was kind of where the impetus of just data, and utilizing large, vast quantities of data, became sort of key, because after that, when I went to the FFRDC, we utilized a Hadoop framework to grab tons of ship data and crunch Navy vessel information into something that was actually understandable or
consumable by our project managers. And that was a project that I led and developed. I ended up using that framework to then go into predictive analytics. And we were predicting kind of the 2018 fiscal cliff that the Navy and the government went through, and it allowed us to circumvent a lot of the sort of downward spiraling that most businesses felt during the fiscal cliff era. But
that’s what led into my next venture, which was the Crypsis Group. And that was an incident response forensics firm. At the time, forensics was kind of new, I want to say it was new in the incident response era, and the average billable hour was right around $500 an hour.
And we wanted to disrupt that market. So we went in with the Crypsis Group and we were charging, I think our first hour was $220 an hour. So we were undercutting the market by 50% or more. And we sort of just took off from there. It was a rapid growth. We went from zero in revenue, or our first year in business was a million in revenue, and then by the time it was purchased by Palo Alto Networks, it was north of 20 million. So in
four years’ time, it went from less than a million to 20 plus. And I think the idea, and some of, let’s say, the articles out there in the insurance world, they always talk about how the cost of breach response is increasing. So the best way for us to actually
Greg Lambert (06:40)
Did you get those rates? Did you get those rates up?
Marlene Gebauer (06:45)
I was gonna say, I think rates are higher now.
Christian Geyer (07:00)
penetrate the market is to lower the cost. So that was where Crypsis went in at half the price, and it was something that we did here at ACTFORE. When we came into founding it, it was: how do we make things cheaper for the insurance providers? How do we make things better onshore? And it was almost similar, we were cutting the market price by roughly half and we were meeting, and actually beating, overseas vendors, which were providing them with
tons of armies of human reviewers.
Greg Lambert (07:32)
You know, working with insurance groups, the first thing they want to hear is how you’re going to cut rates, because they’ll cut them for you if you don’t. And I love how quickly you jumped into military mode with all the acronyms right off the bat there. It reminds me of my army days. Well, one last thing, though: talk to us about this carpentry that we keep hearing about.
Christian Geyer (07:38)
Thank
Marlene Gebauer (07:56)
Yeah, we want to hear about the carpentry.
Christian Geyer (07:56)
Oh,
no, I guess I was, I was injured probably back in, gosh, it was 2006, 2007. And I had to kind of find a new, a new passion in life. Then I ran into carpentry, and at the time it was 2005, before the housing bubble burst.
And I was buying, flipping and fixing up houses. So I got into carpentry, framing, finished carpentry, as well as like cabinet making. So a few years of that before, before the bubble burst. Yup.
Greg Lambert (08:29)
All right.
That’s a skill that AI won’t replace, I think. You’ll be fine. You’ll be fine if you have to fall back on that.
Marlene Gebauer (08:35)
Never goes away.
Christian Geyer (08:42)
or maybe when I retire.
Greg Lambert (08:42)
Well,
Marlene Gebauer (08:45)
Exactly.
Greg Lambert (08:45)
yeah. So ACTFORE actually emerged from ActiveNav, I think back in late 2022. And the mission there was to kind of reinvent breach notification and data forensics. And so, I mean, you kind of started on that. Do you mind expanding on the origin story of ACTFORE and, you know, kind of the gaps that you saw in the market for
breach response?
Christian Geyer (09:11)
Of course. Let’s say, I guess let’s set the stage of where we were in 2022 or late 2021. It was right after COVID, and what ActiveNav is, is an enterprise software tool used to identify sensitive data in clients’ environments. And at the time I was their global COO. So what we were doing was selling our software worldwide to anyone from the Fortune 500
to the Global 1000s, to identify and clean up their sensitive data. But COVID cut a lot of budgets. It made selling enterprise software extremely difficult. So just like everyone else, we went through massive cuts, layoffs, restructurings. And it was two years in the works, post-COVID, that we were running sort of short on cash reserves.
We did everything that we could to maintain the business and we needed something new. We needed something that could actually sell in an environment where proactive data cleanup wasn’t necessarily on everyone’s, let’s say, pocketbook or forefront of their mind. So we launched ACTFORE, which primarily came out of
what I was finding out in the market. I went out and I spoke to a couple of people from my past contacts at the Crypsis Group. And what came about was: what was the biggest pain point in incident response at the moment? And it was the breach notification process. What we were hearing from, let’s say, insurance carriers, for one, is that vendors were coming in and they would give them a quote of, let’s say, a hundred thousand dollars.
Then after three months, they would receive another quote, or an uplift, for another half million, and then another half million a few months after that. And before they knew it, what was promised to them as a one-to-two-month delivery ended up turning out to be 12 months long. And it was ballooning to 10x the original quote. And that was the pain point that the insurance companies felt. Then I went out and I...
Greg Lambert (11:23)
It’s a story as old as time, I think.
Marlene Gebauer (11:25)
Exactly, yeah.
Christian Geyer (11:27)
Yeah, the constant uplifts. And believe me, I lived through that with the government, and how we actually went to a fixed-price model in the government to shift the risk from the government holding the bill to the actual vendor or subcontractor performing it at a fixed cost. But back to sort of how it came about: I also spoke to the law firms, the
breach coaches, or the lawyers that were actually handling the cases. And their feedback was that what was being done in the market was of poor quality. So what the insurance carriers were paying for was overseas work. Basically, we would ship our data over to, let’s say, areas like India for mass processing by armies of human reviewers. And the actual
product that they were getting in return had errors throughout. It required tons of cleanup, where the law firm had to put their associates and paralegals onto those reports to then clean them up, just to get them to the point at which they could present it to their client to then get those breach notification letters out. So that was their pain point: the quality of the work was either unacceptable or required additional
manipulation, and also their timeline. They were constantly being dragged out on timeline. I mean, if you’re engaging with a vendor and they’re telling you one to two months and it ends up being 12 months, how does that reflect on your relationship with the client that you’re representing? So there is that pain point. And then obviously the one from the client perspective. So, gathering all that information,
we then created... I knew that there was a better way of doing it. In this era of automation, why are we doing it all manually? So then we used our software at ActiveNav, which discovered sensitive information. And I knew that in the backend, if you can discover it, why aren’t you storing it to then report on later? So we took the actual core code
of ActiveNav and manipulated, or augmented, it slightly to then continue to store all the information that it was scanning, which allowed us to automatically extract social security numbers, dates of birth, names, addresses. Instead of having to have a human reviewer type that all in, we could augment existing code out there and turn it into an automated process. And I would say within our first engagement, we were brought
what was... I asked one of my friendlies in the insurance and legal industries to bring us an easy project on which we could display our product. And they brought us probably one of the most difficult ones, which, yeah, it was a 2,000-endpoint breach at a hospital network. And their 2,000 endpoints were spread across three different states, all remote in COVID.
Greg Lambert (14:23)
Yeah, easy is relative.
Marlene Gebauer (14:36)
Mmm.
Greg Lambert (14:38)
Sounds easy to me.
Christian Geyer (14:38)
⁓ so, so we had to go out.
Marlene Gebauer (14:39)
Wow.
Christian Geyer (14:41)
Yeah. We had to touch 2,000 endpoints, pull back all the data, process it. And the kicker was their breach notification timeline for fees, or for penalties, was five days away. That’s why no one would touch it. No one would say that they would do 2,000 endpoints in five days and deliver a notification report. And, surprisingly, we did it in...
Marlene Gebauer (14:41)
Simple, just simple.
Christian Geyer (15:05)
I think it was four days and 20 hours that we ended up delivering. And we delivered, I think it was over 3 million patient records, all the information in them. And we did that all in four and a half days, four and three-quarter days. And at that point we knew we had magic. The law firm is one of our best clients even to today.
Greg Lambert (15:09)
in 23 hours and 59 minutes.
Marlene Gebauer (15:11)
Hahaha.
Christian Geyer (15:32)
They continued to push additional partners on us, and here we are today, now 1,300-plus engagements at last count. We’ve had 1,300-plus deployments in two and a half years, which is pretty impressive from what I’ve seen.
Marlene Gebauer (15:49)
Yeah,
I’m curious. You’ve told us about why an onshore model versus an offshore model. You’ve touched on why AI first. You’ve mentioned a few types of clients. Can you dig in a little more? What’s your target audience, core services, the products, and features that ACTFORE delivers?
Greg Lambert (16:13)
Yeah, and you talked a bit about automation. Are you also, in the age of AI, implementing AI within the process now?
Christian Geyer (16:22)
Yeah, so when we talk about this, probably four different questions in there. Let me see if I can get them right. So, target audience. I think I kind of mentioned three of them in that earlier we’ve got a three pronged clientele basis, the insurance carriers, which are the ones who will ultimately pay the bill through cyber insurance policies.
Greg Lambert (16:27)
yeah, we’re known for our multi-level
Marlene Gebauer (16:29)
We’re known for compound questions.
Christian Geyer (16:47)
You’ve got the law firm, who is representing the client; we actually are engaged through the lawyers, so we’re under attorney-client privilege. We’re an extension to allow them expertise in data processing. And then you’ve got the third one, which is actually the breached client. So our target audience is kind of all over the place, or it’s three-pronged. But what I would say is, it’s for somebody that’s seeking fast,
accurate notification lists, and potentially even more cost-effective. Why I say potentially is because we have a model where we’re software-license-based, or software-instance-based. Meaning we could throw one software instance at a project, which is one software instance manned by two different people, and that can actually run your entire data engagement.
You could throw 10 terabytes at one software instance. It would just take an extended amount of time to process. But what you’ve done is you’ve limited the cost exposure. But then if a client wants to go extremely fast... I can probably share with you a 10-million-file engagement, if you’re so interested. We did a 10-million-file engagement with 15 software instances, and it was 10 weeks.
It’s something like that, where our target audience could be somebody who’s looking for cost savings, but then also one who’s looking for rapid response. But it’s really all about somebody who’s looking for an accurate... truly, you know, what you’re going to get, what you’re paying for. Whereas in an environment where you’re potentially engaging with vendors that go overseas, you don’t know when you’re going to get the report back. You don’t know what the ultimate cost of that engagement will be.
You don’t know how accurate that will be. We’re the ones who provide predictability in the environment. I was going to hit on the others. Let’s see. There were three other core services. So, of course. All right. Core services.
Marlene Gebauer (18:44)
Core services, products, and features. And then, gen AI.
Christian Geyer (18:54)
A company has been breached, their data stolen. There’s a regulatory requirement to notify every individual that’s impacted. It could be either PII, personally identifiable information, or PHI, which is health information. We can go even further into GDPR or FERPA data, which is like academic records. But what we do is provide...
When those breached clients come to us and they’ve got mountains of data and mountains of documents, we’re the ones who actually now go through that in an automated procedure to then extract all of the data. You may have... let’s say there was a breach last year that impacted nearly two out of every three Americans. So most likely some of your listeners have received a letter in the mail that said that your social security number, maybe your date of birth, and some other information were stolen.
We’re the ones that help go through that client data to produce that. So that’s kind of our core deliverable or core service. Our product is an automated solution. So we took the ActiveNav software platform and turned it into our own, or augmented it slightly, but we are also adding additional features, so I can turn this into one question. So, our features: it’s the automated extraction, and...
Greg Lambert (20:11)
All right.
Christian Geyer (20:17)
We actually have our own R&D team internally, where we actually pour our additional funds back into it. And that’s developing our own AI, our own ML, as well as the identification of, let’s say, cursive and handwritten forms. Then we can extract it and understand what type of information is in there. But we’ve got patent-pending
features where it’s targeted automation, or extraction tools where you just point and click and it automatically extracts information without keystrokes. So again, it’s all about getting faster, more efficient with processing. That’s what we’re into: making sure that we’re making incident response and the breach notification process more cost-effective as time progresses, and more efficient
as we move along. I guess, gen AI: I would say, starting from 2022, we knew that AI was the game changer in the industry. So we’ve been working on it. We’ve been implementing it. We’ve been utilizing it since 2022. I think everyone else out there is sort of playing catch-up at this point.
The sort of competitive advantage that we have is we’ve built our own, whereas others are using third-party, sort of, API calls to another AI model. It could be a backend API call to ChatGPT or something of that nature to then help it process sensitive information. The biggest downfall there is cost competitiveness. While we’ve invested in it, we’ve
sort of built it ourselves, so we don’t have ongoing costs. Other vendors are out there utilizing a third-party cost, which is significantly more expensive, and at this point they need to figure out how to get costs down.
Greg Lambert (22:07)
Yeah, you know, for better or for worse, breaches are getting more frequent, not less frequent. So you’ve entered a hot market. But there are, you know, other service providers out there, like Epiq with their Canopy product, HaystackID, even LPOs like Integreon have developed these types of, you know, breach workflows. And I think you’ve
identified a couple of things that make you distinct. So I want to dig into the fact that you’re developing your own AI model. Do you mind kind of talking about how you’re developing that? Is it a pure machine-learning type of extractive process, or are you finding ways to leverage more of the generative AI in your...
Christian Geyer (22:59)
Yeah. So, let’s say we’re limiting gen AI, simply because, when you’re talking about what data was exposed, we want to ensure what was truly exposed is what’s being reported on, versus a generative AI’s interpretation of the data. So when we talk about
AI and refinement of AI, it’s understanding the actual documents that are processing through it, learning from those documents to then better extract the information. The way we tend to describe it is, we’ve built an infant AI, and what we do is we’ve locked that version down.
We don’t want to train our models on various clients’ data. This is one of our differentiators that helps us in GDPR-type environments, European restrictions on AI training, in that what we’ve done is we’ve built an infant, and we drop that infant into every single engagement. It learns on that specific client’s information. So when you think about an AI that has to store libraries of data that it’s been trained on,
statistical models of what information has passed through it, it’s storing countless information that it may never ever use again, specific to another client that has a different infrastructure, a different document setup, a different document structure in their onboarding documents, maybe even their payroll processing documents. Whereas what we do is we take that infant, drop it into a single client environment, let it process, let it learn off of that through our assisted learning.
And we basically found that we can go significantly faster with an untrained model that’s trained specifically on a client’s data than taking a mass AI that’s been trained over countless deployments, trying to deploy that in a single client environment, and then having to train it. It actually requires significantly more training and hands-on-keyboard
if we’re taking a large-scale model and trying to fit it into a single client deployment. So that’s kind of our differentiator. That’s why we’re moving significantly faster than our competition.
Greg Lambert (25:14)
Is there a lot more cost involved on your side in dropping an untrained AI in and training it? Or time? I’m curious. That’s very interesting. It makes sense, but I’m curious on the cost side and maybe time-wise.
Marlene Gebauer (25:23)
every time.
Like every time it’s a different. Yeah.
Christian Geyer (25:34)
When we first started, we would deploy it and let it run by itself. One of the key reactions from the market was, a human always needs to be in the loop. So, exactly. Even if it’s a significantly trained model and you’re dropping a well-trained model into a client environment, you could let it run by itself, let it learn, constantly learn and extract. But the industry wanted hands on keyboard. They wanted a human in the loop.
Greg Lambert (25:43)
You don’t want an infant running by itself.
Christian Geyer (26:01)
So what we did was we actually dialed the AI back and turned it into an infant so that we could deploy it, so that we always had hands-on keyboard. That was actually a refinement that we were able to do on the fly with feedback from all of our customers, in that they felt more comfortable if we had our forensics experts always in front of the keyboards, always looking at what was being extracted. So we made a quick...
what would you call it, an option, or... I forget the football phrase, an audible. We audibled and moved to an always-hands-on-keyboard, human-in-the-loop model that then leverages that infant AI. So when we think about, is it more costly or is it more cost-effective: in the long run,
it’s more costly, but in the immediate, for what the client needs, we’re still cost-effective. So there are refinements where we could actually go, in the future, to just making it a pure AI model, if the industry adapted to that, where they actually didn’t require human in the loop. And if we did that, we could turn an engagement even significantly more, sort of, efficient.
Marlene Gebauer (27:14)
I want to talk about Trace, the Trace tool. So the platform can analyze, what is it, I guess like a million files an hour? And then in 2025, you introduced Trace to accelerate the notification list extraction. So if you could, walk us through how Trace works, how it integrates with the Core Data Mining Engine, and tell us why it cuts weeks off traditional review timelines.
Christian Geyer (27:44)
So, I guess, some background information on how we can actually automate the extraction. How we train the model is, it looks for patterns of identical files. So if you think, the easiest form is a W-2. An employer in the US generally provides a W-2, and it’s going to be in a government form. So that’s easy.
It’s semi-structured in nature. We know where the name is, where the address is going to be. We know where your social security number is. Where automation plays a factor is, you know what location those data elements are in, so you can extract them at scale. But further on, what if you have a completely unstructured file? A typed-out Word document. But if you’ve got a process that’s constantly repeated in your company
of utilizing that Word-type format, we can then find structure in that unstructured document. And that’s where our AI platform will actually find identical files, based on what we call fingerprinting: it’s looking at it at a 10,000-foot view, looking at the words on the document and turning them into just pixels. And then we’re using those pixels in the form of, like, the fingerprint identification that the FBI uses.
So we actually use that to find like files to then extract. Now, where Trace comes into play here is, there are some documents that you just cannot automate, or the cost-benefit of automating them declines, because there’s maybe only one file of that type. Maybe there are only 10 files of that type.
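The fingerprinting Christian describes, collapsing a document to a coarse layout signature so files generated from the same template cluster together, is in the same family as perceptual hashing. A toy text-layout version (our own simplified sketch, not the production algorithm) might be:

```python
import hashlib

def layout_fingerprint(text: str) -> str:
    """Reduce a document to a coarse structural signature: per-line word
    counts and word lengths, ignoring the actual words. Documents produced
    from the same template collapse to the same fingerprint."""
    shape = []
    for line in text.splitlines():
        words = line.split()
        shape.append((len(words), tuple(len(w) for w in words)))
    return hashlib.sha256(repr(shape).encode()).hexdigest()[:16]

# Two forms from the same template differ in values but share a layout.
w2_a = "Name: Alice Brown\nSSN: 123-45-6789"
w2_b = "Name: Carol Davis\nSSN: 987-65-4321"
memo = "Quarterly results attached, see below for details."

print(layout_fingerprint(w2_a) == layout_fingerprint(w2_b))  # True
print(layout_fingerprint(w2_a) == layout_fingerprint(memo))  # False
```

Once like files share a fingerprint, a field location confirmed on one of them can be applied to every other file in the cluster.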
Marlene Gebauer (29:26)
What... can you give a type of example? Like, what would that be, that type of file?
Christian Geyer (29:31)
Maybe you wrote an email.
Yeah. You wrote an email to Greg that basically said, hey, here’s my soccer team’s information, you want to sign up? Or, yeah, here’s my child’s information. Whereas it’s a personal email that’s stored in a company inbox, and you’ve sent that to a coworker, and...
Marlene Gebauer (29:42)
wait, all the time.
Greg Lambert (29:44)
Yeah, it’d be volleyball team
for Marlene.
Marlene Gebauer (29:48)
You
Christian Geyer (29:58)
It’s not something that’s going to be repeated throughout the environment. It’s a one-off unstructured document. Now, the AI could be used on much larger batches, but while that’s processing, how do we automate the single files? How do we automate those that would go to human reviewers, where there’s only one file of a type?
So we send that off to our new platform, which is Trace, which stands for, this is a mouthful, targeted retrieval and automated content extraction. What it is, is a review platform, à la Relativity or Canopy. But unlike those, where you’ve got to type in information... let’s say if I got that email, maybe that
Marlene Gebauer (30:35)
Good.
Christian Geyer (30:51)
hacker received that email, or stole that email, that had your child’s sensitive information in it. Traditionally, a human reviewer would have to type out your child’s name, type out their date of birth, their social security number, and then that would be stored in a Relativity environment. Well, for us, with Trace, it’s now point and click. The second we mouse-click onto the person’s name,
it extracts that name. When you click on the date of birth, it extracts it. Click on the social security number, and it extracts that. So it’s taking what could be a hundred-plus keystrokes down to three, if it’s your date of birth, social security number, and name. That allows us to go significantly faster.
Marlene Gebauer (31:41)
Yeah, that’s huge.
Christian Geyer (31:44)
That's what all of our R&D teams are working on: how do we go faster, and how can we be more efficient in that process?
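The point-and-extract idea Christian describes can be sketched in a few lines. This is a purely illustrative toy, not ACTFORE's actual Trace implementation: the `extract_at` helper, the field names, and the whitespace tokenizer are all assumptions invented for the example. The idea is simply that a click becomes a character offset, the token under the cursor becomes the field's value, and a handful of clicks replace a hundred-plus keystrokes.

```python
# Purely illustrative sketch of a point-and-extract workflow; not ACTFORE's
# Trace implementation. The tokenizer and field names are invented.

def tokenize(text: str):
    """Return (start, end, token) character spans for whitespace-separated tokens."""
    spans, pos = [], 0
    for tok in text.split():
        start = text.index(tok, pos)        # locate this token in the raw text
        spans.append((start, start + len(tok), tok))
        pos = start + len(tok)
    return spans

def extract_at(text: str, click_offset: int, field: str) -> dict:
    """One mouse click: map a character offset to the token under the cursor
    and record that token as the value of the chosen field."""
    for start, end, tok in tokenize(text):
        if start <= click_offset < end:
            return {field: tok}
    return {}                               # click landed on whitespace

email = "Name: Alex DOB: 2010-04-02 SSN: 123-45-6789"
record = {}
# Three clicks instead of retyping every value by hand:
record.update(extract_at(email, email.index("Alex"), "name"))
record.update(extract_at(email, email.index("2010"), "date_of_birth"))
record.update(extract_at(email, email.index("123-45"), "ssn"))
print(record)  # {'name': 'Alex', 'date_of_birth': '2010-04-02', 'ssn': '123-45-6789'}
```

A real reviewer UI would of course translate mouse coordinates into the character offset, but the keystroke savings come from exactly this mapping: one click per field instead of one keystroke per character.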
Greg Lambert (31:55)
I was going to ask you about some customer success stories, but I think we've talked about a couple of those. So I want to actually turn the lens internally. What does ACTFORE as a company look like now versus when it started? I mean, what's the employee growth been? Where are you located? You said you have 1,300 matters going on right now. Feel free to brag about ACTFORE.
Christian Geyer (32:27)
Let's see. We were actually launched out of a closet.
Marlene Gebauer (32:34)
Hahaha.
Like a closet closet?
Christian Geyer (32:36)
Yes, a closet closet. The reason we were birthed out of a real closet is that we were birthed out of ActiveNav. At the time I was their global COO, and when I brought this idea to our board members, they really didn't like it. They didn't like the fact that I would be distracted from selling their core
Marlene Gebauer (32:40)
Okay. Not a small office, but a real closet. Nice.
Greg Lambert (32:43)
Hahaha.
Christian Geyer (33:01)
software, from looking after the actual company and making sure it was being managed and run correctly, to maybe going off and trying a new venture. But what we said was: can I work on it from 5 PM to 6 AM? Can I actually use that time? And they agreed. So it was literally three people working inside of a closet in our ActiveNav offices,
during the 5 PM to 6 AM window, to launch a company. So it's kind of like that garage story, but ours is a closet. Thanks.
Marlene Gebauer (33:37)
Mm-hmm.
Greg Lambert (33:40)
not just a closet, but you had to share that closet.
Marlene Gebauer (33:43)
And I like that there were certain set
times, like you can only do it between now and now. That's great.
Greg Lambert (33:46)
Hahaha!
Christian Geyer (33:47)
Yes.
Yeah. I think our first month in business, we delivered over $500,000 in deals with three people in a closet. So we quickly moved out. We got our own rental space; we went to a WeWork. And then, I think, within
Marlene Gebauer (33:59)
And then they decided, well, it’s not such a bad thing after all. ⁓
Greg Lambert (34:01)
Yeah. Yeah.
Christian Geyer (34:13)
our first real month in business, where we said this is our product, we're going to go to market, this isn't a concept we're running out of a closet, we did one million in revenue. And that was out of a WeWork space with 10 individuals, working countless deals at the time. Today, we have our own permanent office space, which is really nice. We've got our logo and name up on the building, so it feels good to have that there. In terms of employee size, because we're not your traditional employer and we're looking for efficiencies in software automation, we're only about 35 individuals. We do have some part-timers on the bench for increased workload, or in case a large engagement happens. So we actually have upwards of 78 when you consider all of the part-timers on the bench, but our core team is about 35. And at last count, we have over 1,500 on-prem processors. When we started off, we had roughly 90 processors provisioned to us; now we're at 1,500 at rest,
and we can spin up countless engagements on a whim. So it feels good.
Greg Lambert (35:42)
Are your clients mostly in the US, or are they international? What's your client base?
Christian Geyer (35:49)
So we can actually be deployed all around the world. Although we hold an ISO 27001-certified, on-premise, air-gapped environment in Virginia in the US, we can process all of our clients' data onshore, entirely in the US, if they ship it to us or upload it to us via SFTP. Now, for engagements of a global nature,
we can deploy our software on-premise in the client's environment. Meaning we've had global engagements where a Japan server, a European server, and a US server were impacted, or countless servers. What we did was drop three software instances: one in Japan, one in Europe, and one in the US. The closer we can be to the actual data, the faster the processing is going to be.
If we had dropped a single instance in the US and tried to migrate the data from Europe over to the US, you're adding bottlenecks. Similarly from Japan over to the US. What our on-premise model allows is that we can drop our software as close to the data as the client can provision for us. So we'll deploy it all around the world, man it remotely,
and then process the data where it rests.
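The deploy-close-to-the-data model Christian outlines can be sketched as a simple routing rule: process each dataset on the instance co-located with its region, rather than migrating the data cross-border. The instance names, regions, and `route` helper below are hypothetical, invented purely for illustration; this is not ACTFORE's actual deployment tooling.

```python
# Illustrative sketch only: route each impacted dataset to a processing
# instance deployed in the same region, so data is processed where it rests.
# Instance names and regions are hypothetical.

from dataclasses import dataclass

@dataclass
class Instance:
    name: str
    region: str  # region where this software instance is deployed

# One instance provisioned next to each impacted server
INSTANCES = [
    Instance("proc-jp", "japan"),
    Instance("proc-eu", "europe"),
    Instance("proc-us", "us"),
]

def route(dataset_region: str) -> Instance:
    """Pick the instance co-located with the data, avoiding the bottleneck
    of migrating the dataset to another region for processing."""
    for inst in INSTANCES:
        if inst.region == dataset_region:
            return inst
    # No co-located instance: fall back to the first available one
    # (the cross-border case the model is designed to avoid).
    return INSTANCES[0]

print(route("europe").name)  # proc-eu: the EU server's data stays in Europe
```

The design choice is the same one Christian describes: moving a small software instance to the data is cheap, while moving terabytes of breach data across borders adds both transfer bottlenecks and regulatory friction.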
Marlene Gebauer (37:16)
Do you run into problems with different types of regulatory requirements regarding data when you’re in different jurisdictions?
Christian Geyer (37:25)
Yeah, so there are countless regulations around the world, some more robust than the US's. Actually, quite a few are more robust than the US's. So yes, we have to abide by local jurisdictions' regulations, and we work with our legal counsel to identify what's required of us. If you're deploying on-prem in the client's environment, that allows more flexibility.
Marlene Gebauer (37:29)
Yeah.
Most?
Greg Lambert (37:34)
I said,
Christian Geyer (37:52)
If the client has to transmit, or would prefer to transmit, the data to us cross-border, then we work with their legal counsel to provide the justification for why moving it cross-border to us is more efficient than processing it in their own country, or sending it to a foreign country for processing by, let's say, human reviewers.
We work with counsel to provide all that justification and ensure compliance with local standards.
Marlene Gebauer (38:25)
Where do you see the next frontier of innovation in breach response and data forensics? Are there new AI capabilities or analytics you're exploring to help clients
not only understand the exfiltration scope, but also make strategic incident response decisions? What do you see in terms of where ACTFORE should be moving in the future?
Christian Geyer (38:57)
So at Legalweek, I was speaking to a couple of new AI vendors, and, shockingly, to some CTOs as well as CISOs for those AI tools. The thing that scared me the most is what I don't think they're realizing: the information that's stored in the backend of an AI model. What they're trying to do is create an AI that can be deployed internally into
a company or a client, a centralized AI repository for all their employees to go out and touch, speak to, to make employee processes significantly more efficient. That's great in the long run. It's great that employees have a knowledge center to get quick answers to questions. But what that does is it creates
what I consider the greatest insider threat that we're going to be facing in the near future. With all these trained AI models being deployed in-house in a client environment for all their employees to leverage, what you're doing is creating a very knowledgeable AI on all of that company's information: all their processes, all the data they store.
What we're focusing on is how to break apart those AI models, how to get past them and extract the information they've learned. We're getting ahead of that, because we know the next breach is going to be right there. The next big breach, the massive breach that's going to hit headlines, is going to be about how an internal AI model was utilized by a threat actor to leverage information internally, to extract as much
knowledge and as many crown jewels as possible from a client. So we need to understand how they're utilizing that AI model to extract information, so we can notify on it. That's what I think is the next big thing. And when I spoke, ⁓ go ahead.
Marlene Gebauer (41:00)
So,
I was just thinking of a scenario: a partner leaves for another firm and they sort of extract that information from the AI model. Is that what you're thinking?
Christian Geyer (41:21)
Well, no. That could be one aspect: when a client leaves a law firm and moves to another, you've got to transition all the information you have on that client, and what's housed in that centralized AI isn't easily extracted so you can say, this is for XYZ client, let me ship it to the new firm. That is one use case, but it's not traditionally ours. We're looking at
breaches and how the breached data can be exposed and processed. My best example is one of my ex-employers. They have 500-plus employees, and they developed their own in-house AI to help all their researchers,
where they could drop in their research papers and ask the AI questions about how other models or other research studies were going, to help their own studies. They're a government contractor that does research studies for the Navy and Marine Corps, so they house countless data on projects impacting numerous military programs.
Imagine if I got access to that AI and could ask it any question I wanted. Maybe I had credentials of an elevated nature and could make myself look like an elevated individual talking to that AI, and I could ask it: tell me about every project that's going on. Tell me about this Navy project. Tell me about the Marine Corps projects. I could then leverage that from a threat perspective.
Now, the other aspect. When I was talking with the CISOs and CTOs over at Legalweek, I asked them: do you know that your AI is storing sensitive information in its backend and its libraries? They said, we know it. I said, are you solving for it? Do you realize that's a threat? And they said, we've been told it's a threat. I'm going to loosely quote this, because I don't have it on a recording, but it was:
"I know that it's a threat. I've been told by my security team that it's a threat, but I'm going to ignore them for as long as I need to, until they come to me screaming and I can't ignore it." That is not the way we should be developing a technology that could be utilized as an insider threat, a tool we've created that a hacker
could actually leverage against us.
I guess that's where we're focused: we're not trying to protect it. We're trying to understand how we could leverage those AI models to extract the information once they've been breached. That's where we're focusing our next R&D push: tearing apart an AI, exposing the collectors, exposing the actual agents it's using, and understanding the inputs it's receiving and where it's storing them.
Greg Lambert (44:28)
So, I think you've answered about three quarters of my crystal ball question. So I'm going to narrow the focus and have you pull out your crystal ball. Over the next couple of years, given this specific threat you see, where companies, law firms, insurers, and others are
exposing themselves with the internal AI tools that everyone is raving about and wanting to deploy, I'm wondering if ACTFORE becomes more than just an after-the-fact kind of company that comes in and cleans up the mess when it's done. Do you see yourself positioning yourself in almost an advisory role, to head off these potential breaches
before they happen? What's the next advancement on the horizon for ACTFORE?
Christian Geyer (45:24)
Yeah, so I've been engaged numerous times over the last two years to come in from an advisory perspective. We've been asked to proactively scan an environment to help identify where a threat actor could come in and where they could leverage data. Right now, and I don't want to say it's self-serving, but we're focusing on breach response,
because there are countless breaches out there. Although we've done 1,300-plus deployments in two and a half years, there are so many more where we need to protect Americans' data, protect global data, protect individuals falling victim to identity theft and financial risk. There's so much more we can do. What we're trying to do is get our technology into every single breach that we possibly can.
Now, you could say that if you went proactive, you could get there before the actual breach occurs. I've been in three companies now that were proactive to begin with, or that even tried a proactive tool. And unfortunately, wallets don't open quite as frequently for proactive work as they do for reactive. So, in my crystal...
Greg Lambert (46:42)
Now law firms are the same way. Law
firms are the same way. We talk until we're blue in the face about how they shouldn't be doing something, but where we make the money is when they don't listen to us and go ahead and do it anyway.
Marlene Gebauer (46:46)
It’s so true. It’s sad, but it’s so true.
Christian Geyer (46:55)
Yeah. During the congressional hearings in 2024 on that big breach, I was reaching out to congresspeople saying, hey, I can advise on this. Do you need some assistance? We need to develop and strengthen our privacy policies. We need to boost our data retention and data encryption requirements. And I got pretty much zero response back from our congresspeople. And it's like…
Marlene Gebauer (47:19)
Crickets.
Greg Lambert (47:22)
Shocking, it’s just
shocking.
Christian Geyer (47:23)
If I
had my crystal ball, I don't foresee that, simply because the appetite for being proactive is not on everyone's mind, unfortunately. Where I do see it is that we're hearing more and more from foreign jurisdictions. In the European context, they're far ahead, probably at least a decade ahead of US policy, and
they're reaching out to us from a proactive perspective. If there were, let's say, a proactive product offering or advisory offering, that's probably where it would happen within two years. For the US, we're probably talking two decades.
Marlene Gebauer (48:06)
Well, Christian Geyer, it has been so nice learning more about you, about ACTFORE, and you’ve given us a lot of food for thought. So thank you so much for taking the time to speak with us here on The Geek in Review.
Christian Geyer (48:21)
Appreciate it. Thank you. My pleasure.
Marlene Gebauer (48:23)
And
of course, thanks to all of you, our listeners, for taking the time to listen to The Geek in Review podcast. If you enjoy the show, share it with a colleague. We'd love to hear from you, so reach out to us on LinkedIn and Bluesky.
Greg Lambert (48:35)
And Christian, we'll make sure that we put information in the show notes, but if listeners want to learn more about ACTFORE or connect with you directly, what's the best place for them to go?
Christian Geyer (48:48)
For ACTFORE, our website is actfore.com, and from there you can fill out a form. Honestly, it goes directly to me; I'm connected to every single inbound form, and I like to reach out to everyone. You can also contact me on LinkedIn if you have questions about our services, or if you're requiring a breach response.
Marlene Gebauer (49:09)
And as always, the music you hear is from Jerry David DeCicca. Thank you very much, Jerry.
Greg Lambert (49:14)
Thanks, Jerry. Talk to you later.
Marlene Gebauer (49:17)
Bye. Hey, everybody join my soccer team.
Greg Lambert (49:19)
I thought it was a volleyball team.
Marlene Gebauer (49:22)
Okay, join the volleyball team.
