Join industry leaders and AI gurus for a dynamic panel discussion on how to fundamentally think about and leverage AI in CS. Our expert panelists will share their experiences, strategies, and best practices for integrating AI into CS operations, helping you understand the core principles and potential pitfalls. Whether you’re just beginning your AI journey or looking to refine your approach, this session will provide valuable insights into leveraging AI to elevate your customer success efforts.
0:00
Well, thank you all so much for joining us for an amazing panel on the
0:03
fundamentals of AI and customer success.
0:05
I hope everyone had a good lunch. Yes? Everyone got some good food?
0:09
The poke bowl was amazing. Any other options? Was the curry good? Pasta good? I heard
0:12
all good things. So awesome. Pasta was good.
0:15
Well, hope you've all had a chance to eat, grab some more coffee, maybe get
0:19
some water.
0:19
But super excited to kick off our afternoon sessions. We have an amazing panel
0:23
of folks,
0:23
some of whom were on our keynote stage this morning, Ori and Amit, as well as
0:27
Maria,
0:27
whom we're very excited to have with us as well. I'll let everyone do a quick
0:30
intro,
0:30
but just wanted to kind of kick things off. Two quick housekeeping things.
0:34
We have Slido polls for questions. So in the app, if you open this session
0:39
in your agenda,
0:39
you can leave your questions in the Q&A and polls feature. We'll have a ton of
0:43
time to ask
0:44
amazing questions of our panelists, so make sure you keep those coming
0:46
throughout the session.
0:47
And then we also have those breakout surveys after each session, so just make
0:50
sure you fill those in
0:51
for a chance to enter our raffle. I think those are my only housekeeping things
0:54
, so I'll go ahead and have a seat.
0:56
But super excited to kick off the panel. Just to introduce myself, my name is
0:59
Tori Jeffcoat.
1:00
I lead the marketing team at Gainsight around our CS and AI programs. I don't
1:04
know if anyone was
1:05
here for my first session of the day, but have been in this track all day
1:07
hearing such amazing
1:08
things about AI. Really excited to hear from all of our panelists here today. I
1:12
do have a quick ice
1:13
breaker question for everyone to answer as we introduce ourselves. So our ice
1:17
breaker is,
1:17
what is the most exciting or funny use of AI that you personally have had in
1:21
the last couple of
1:22
months? I can also answer this question myself. I have a three and a five year
1:25
old, so I've actually
1:26
used it to write bedtime stories with their name and based on their interests.
1:30
My three year old
1:31
loves dinosaurs, so we have a lot of dinosaur stories. But that's my own answer
1:34
to that question.
1:35
And with that, I'll hand it over to Ori. So I'll actually start with the ice
1:39
breaker. So it was
1:41
Halloween in the US recently, and I had to get on another call, and there was
1:44
another ice breaker.
1:46
So I needed a joke for Halloween, and the joke that ChatGPT gave me was: why don
1:53
't skeletons
1:54
fight with each other? Does anybody know? Because they don't have the guts.
1:58
I'm Ori, by the way, now a VP of Product at Gainsight, overseeing AI,
2:08
customer success, and Staircase, and prior to that, I was the co-founder and CEO
2:13
of Staircase.
2:16
So my story with AI, my father just celebrated his 70th, so we took a bunch of
2:23
old photos from
2:23
the 50s from when he was a baby and a toddler, and we animated and revived him.
2:28
That was a lot of fun.
2:29
Actually brought tears to his eyes, so that was really cool. And my name is
2:34
Amit. I am the global
2:36
VP of healthcare customer experience with Claroty. It's a very long title
2:40
because we're a verticalized
2:42
company. We do cybersecurity for critical infrastructure, and I run a specific
2:46
sector,
2:47
which is healthcare. So that's why it's so long. Awesome, nice to meet you.
2:52
Thank you. So my name is Maria Bondarenko, and I'm working for SAP Signavio, and
2:58
I'm doing
3:00
mostly operations and making all the data dreams come true. And my funny
3:04
story, I want to say
3:06
it is funny. First of all, I'm using AI a lot for my cooking ideas. When I'm
3:10
out of ideas, I'm just
3:12
asking, I'm asking ChatGPT, "Okay, can you just come up with something?"
3:17
But yeah, the corporate
3:20
funny story is that I found an AI that actually comes up with songs as
3:27
well as with the music,
3:29
and I just fed it with some names of our teammates and then fed a few phrases,
3:35
and then it came up
3:36
with a very nice song, and then I also posted it on the channel, and they were
3:40
very surprised
3:40
with my hidden talents, but of course, I told them that it was actually not me. I'm
3:44
not that talented
3:46
yet, but it was quite funny. Awesome. Well, it's so great to meet you all and
3:51
have you as part of
3:51
this panel. I'm going to kick us off with a question around building and
3:55
choosing the right AI for
3:56
post-sales. Everyone on this panel is either building AI, in your case, or
3:59
having to choose
4:00
and navigate AI for CS teams. Curious when you're thinking about AI, how do you
4:05
prioritize what
4:06
are the most important critical ways to use AI to get the most value within
4:10
customer success?
4:11
We can just go through the panel, or if you want to kick us off. Yeah, so I
4:16
think a couple of things.
4:18
One is when we talk about AI, I think everyone today goes straight to ChatGPT
4:22
and LLMs, but
4:23
there's a lot of other AI out there from various kinds of machine learning that
4:28
are more statistically
4:29
oriented or things like that. I think it's really important, like anything,
4:35
that you're going to
4:36
do in your business is first of all to choose the problem that you want to
4:38
solve. It's just,
4:40
there's no point to just spend time. A lot of times you can waste time in AI
4:45
also, going down
4:47
what I call rabbit holes in data science, and we can talk about that. It's
4:50
important to choose a
4:51
problem that, one, is important to you, and two, has
4:56
some technical
4:58
feasibility around it, so that you can see early results soon. I'll give you an
5:03
example from Staircase.
5:04
We had a rule at Staircase where we would try a data science feature, and if we
5:10
didn't get results
5:10
in two weeks, we would drop it. And again, give or take a few more days
5:16
here and there,
5:17
but the reason behind it was because very often we found that we could easily
5:22
spend months on a
5:22
project and not get anywhere. So I think it's really important to kind of
5:26
balance those two
5:27
criteria. So to me, it's about two things. One is increasing operational
5:34
efficiency. That's one
5:35
way of showing ROI, and the other one is like direct impact on revenue and
5:41
finding churn,
5:42
because I am in CS after all. So increasing operational efficiency, trying to
5:47
get any
5:48
manual labor, repetitive manual labor, out the door as much as I can,
5:52
liberate my CSMs, if you will, note taking, inputting data into CRMs and
5:58
systems. Nobody likes that
6:00
stuff, and it's a time-sucking machine. So this is the first thing, and
6:05
Staircase is not the only
6:06
AI tool we're using. We have a how-to video creation tool based on AI. We
6:14
have like an internal
6:15
AI knowledge engine inside the company that we use for CSMs and other employees
6:21
to ask questions
6:22
about the company, about our customers, whatever, natural language questions.
6:25
So really keen on
6:27
achieving this operational efficiency internally, so it saves headcount and
6:32
so on and so forth.
6:33
And the other thing, obviously Staircase in that context, is actually insights
6:39
that impact
6:41
revenue and churn. And I just spoke about it earlier today on stage. Some of
6:48
these insights are
6:49
really key in saving accounts before you lose them. So these are the two main
6:54
areas for me.
6:56
Thanks for sharing. We have quite similar concepts. So especially in operations
7:01
, we first evaluate
7:04
if AI would bring impact, valuable impact for our customer success. And if we
7:10
see that it would
7:11
reduce manual work, it would automate certain things, then we go for it. That's
7:18
a very important
7:19
thing, as you already mentioned, that you need to fail very fast. And our
7:23
directors are also
7:24
pushing a lot to fail very fast. So if it's not working, then just move on from
7:29
that. And that's why
7:30
we are adopting this approach and also doing a similar concept there.
7:35
So it sounds like fail fast if it's not working, and it has to drive actual
7:39
results.
7:40
Better to succeed fast, but fail fast if you can't.
7:42
But it sounds like getting ROI is kind of the real value of AI for all of you
7:50
when you're
7:50
considering choosing tools. Curious when you're thinking about what those
7:53
results are. If you
7:54
could share some examples of how AI has impacted your own teams. And Maria,
7:57
maybe we start at your
7:58
end and come back this way this time. Sure. So one of the recent examples that
8:03
we are leveraging is
8:05
the cheat sheet that is done by Gainsight, where we have a crisp overview
8:11
of the customer
8:12
profile and where the CSMs as well as directors can speed up their routine by
8:18
saving time.
8:19
And then they can also see the whole overview, the whole profile of the
8:24
customer in seconds.
8:26
They can be prepared for the call in a minute, just by reading the whole
8:31
outcome. What are the
8:32
key projects? What are the key risks? Is there any renewal discussion going on?
8:37
And then with all
8:38
this information, they are already prepared for the call. And okay, it's
8:43
already done. And then
8:44
they can finally start focusing on the strategic part of the conversation so
8:49
that they can prepare
8:50
more for this strategy and be a real partner for the customer so that they
8:57
can
8:57
work on the success together with the customer. So we have a very similar
9:05
concept with Staircase.
9:06
So you get this cheat sheet before a meeting, it saves a lot of time. But I've
9:11
had issues with
9:12
CSMs that were a little bit overloaded. And so we used other functionalities
9:18
like
9:19
reports on open items at the end of the week, stuff that really helped them
9:22
make sure they don't
9:23
drop anything. And also we have just like last week used the efficiency report
9:30
to actually work on our headcount growth plan. So we compared
9:37
how much effort and
9:39
energy we're actually investing in certain customers of certain tiers against
9:45
9:45
what kind of investments we should be making in these customers according to
9:50
their tier.
9:51
So we definitely used that as well to show ROI, an actual financial discussion
9:57
with the CFO.
9:59
And then lastly, I mean, even if you save one customer thanks to an insight that
10:06
came in on time
10:07
that's already the best ROI ever, right? So it only takes one customer to prove
10:12
it.
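The tier-investment comparison Amit describes above can be sketched in a few lines of Python. Everything below is an illustrative assumption for the sketch (the tier names, target hours, and logged data), not actual fields from a Staircase or Gainsight efficiency report:

```python
# A minimal sketch of comparing actual CSM effort per customer against
# each tier's target investment. Tier names, targets, and data are
# assumed for illustration only.

# Assumed target weekly CSM hours per customer, by tier
TIER_TARGETS = {"enterprise": 4.0, "mid-market": 2.0, "tech-touch": 0.5}

def tier_variance(logged_hours):
    """Average actual hours per customer in each tier vs. its target.

    logged_hours: list of (tier, weekly_hours_for_one_customer) tuples.
    Returns {tier: (avg_actual, target, delta)}.
    """
    totals, counts = {}, {}
    for tier, hours in logged_hours:
        totals[tier] = totals.get(tier, 0.0) + hours
        counts[tier] = counts.get(tier, 0) + 1
    report = {}
    for tier, target in TIER_TARGETS.items():
        avg = totals.get(tier, 0.0) / counts.get(tier, 1)
        report[tier] = (round(avg, 2), target, round(avg - target, 2))
    return report

# Toy data: enterprise accounts average on target; mid-market accounts
# get 1.5h/week more than their tier calls for.
data = [
    ("enterprise", 5.0), ("enterprise", 3.0),
    ("mid-market", 3.5), ("mid-market", 3.5),
    ("tech-touch", 0.5),
]
for tier, (avg, target, delta) in tier_variance(data).items():
    print(f"{tier}: avg {avg}h vs target {target}h (delta {delta:+}h)")
```

In practice, the targets would come from your own segmentation model and the logged hours from calendar or activity data; the point of the comparison is exactly the headcount conversation described here.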
10:12
Awesome. Ori, obviously you're working on the other side of this, but curious if you have any good
10:17
results from
10:17
customers of Staircase or key use cases you want to share here too.
10:20
Yeah, so I think when it comes to ROI, I look at it like from two different
10:25
aspects. One is
10:26
cost savings, which I think we talk about, which is actually probably easier to
10:30
measure and to,
10:32
if you need to build a business case internally, that's the one to do. The second
10:37
is obviously
10:37
revenue. I mean, like saving a customer or being proactive around a downsell
10:44
or something
10:45
like that, or upselling a customer. The problem there is what we call the
10:49
attribution problem.
10:50
It's very hard to kind of narrow down to one insight or one thing. So it's more
10:55
of an overall
10:57
view of what happened. There are a bunch of things that we see in the field. So
11:03
one is we see incidents
11:04
where an alert came in time. So an example could be one of our customers. The
11:11
CSM was on PTO two
11:14
weeks before renewal. And a very upset email came in on a Friday from a very
11:22
important customer
11:23
And basically, we caught that extremely negative sentiment
11:30
and sent that email
11:31
to the chief customer officer who then pulled in the VP of engineering and both
11:37
of them jumped on a
11:38
call and they saved the customer. Now you could say, hey, you know, that alone
11:43
didn't save the customer,
11:44
but definitely it had a very material effect on the overall outcome. So that's
11:49
an example on the
11:49
revenue side. I think on the efficiency side, we had another customer who had
11:55
five tiers or
11:56
segments for customers. And when you build segmentation, you're not going to
12:01
apply the
12:02
same amount of time per customer. That's the whole idea behind segmentation.
12:06
And when we did an
12:06
analysis in Staircase, we actually found out that two segments had the same
12:10
amount of time per
12:11
customer. So realistically, they weren't two segments. So out of that analysis,
12:18
we recommended
12:18
that they either merge the two segments or change something fundamental in
12:22
how they're serving them.
12:24
So that was another insight. The third, we also use it internally. And when I
12:29
started kind of
12:30
looking at the product team, I'm a big believer in in-market product work. That
12:34
means interacting with
12:35
customers. And we did an analysis on ourselves, like how much time we're
12:39
spending on meetings with
12:40
customers versus internal meetings. And we found out that, in
12:45
certain areas,
12:46
we need to push it up a little bit, by 10 or 15%. So there's another use
12:51
case.
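The save Ori describes, catching an extremely negative email while the CSM was out and routing it straight to leadership, can be sketched as a toy routing rule. The keyword lexicon, threshold, and recipient roles below are illustrative assumptions; a real system like Staircase would use a trained sentiment model, not keyword counts:

```python
# A minimal sketch of negative-sentiment escalation routing.
# Lexicon, threshold, and recipient roles are assumptions for
# illustration, not a real product's logic.

NEGATIVE = {"unacceptable", "frustrated", "cancel", "escalate", "disappointed"}

def sentiment_score(text):
    """Crude score: fraction of words hitting the negative lexicon, negated."""
    words = [w.strip(".,!?").lower() for w in text.split()]
    if not words:
        return 0.0
    hits = sum(1 for w in words if w in NEGATIVE)
    return -hits / len(words)

def route_alert(email_text, csm_on_pto=False, threshold=-0.1):
    """Decide who gets alerted about an inbound customer email."""
    score = sentiment_score(email_text)
    if score <= threshold:
        # Extremely negative: escalate past the (possibly absent) CSM.
        return ["chief_customer_officer", "vp_engineering"]
    return [] if csm_on_pto else ["csm"]

print(route_alert("We are frustrated and ready to cancel. This is unacceptable!",
                  csm_on_pto=True))
```

The design point is the routing, not the scoring: even a crude detector is useful if strongly negative signals reliably reach someone who can act when the account owner is away.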
12:52
Awesome. That sounds like great ROI and great use cases as well for AI.
12:56
Despite the fact that we are starting to see that ROI and see those results, I
13:00
think there's
13:00
often a lot of barriers and fears that still exist when it comes to AI. One of
13:04
them being
13:05
replacing that human element, with digital and automation
13:08
taking so much
13:09
of that off of our plate. Curious to hear from all of our panelists how you
13:12
think about AI in terms of
13:13
coexisting with the human side and how that AI and automation can complement
13:18
instead of replace
13:19
some of those human pieces. And Ori, maybe we can start with you on this one.
13:22
So I think we all have a little bit of fear of AI. I mean, I think it's justified.
13:30
From like,
13:31
hey, the robots are going to take over, are we going to have jobs in 10 years,
13:34
things like that.
13:35
But the more I spend time in it, I actually realize or think that it's really
13:39
far away.
13:40
I think we all play around with ChatGPT and see that, okay, it can do so much,
13:45
but it also
13:46
can't really replace a person. But I think it can make our life easier. And I
13:52
think that there
13:52
are certain areas where I believe in kind of the 80/20. So a lot of times AI
13:57
will do 80% of that work,
13:59
but you have to do the fine touch-ups to something you're writing or something
14:04
that you summarize or
14:05
it could be that you have a task and otherwise it would take you two hours. So
14:11
I'll give you an
14:11
example. We came back from an off site in India and I had a whole bunch of
14:15
thoughts in my head
14:16
right and I didn't really feel like organizing them on a piece of paper. So I
14:19
just wrote them
14:20
kind of as a stream of consciousness. We have our own instance
14:23
of ChatGPT,
14:24
so it's secure and all that. We fed it in there and asked, can you arrange this
14:27
into kind of a
14:28
cohesive structured product thing? And it was great, right? But I still had to
14:32
kind of polish it. So again,
14:33
I think there's a distance, there's time before that. I think that the area
14:40
is the scaled side,
14:41
which I think is not a surprise. I think scale is where we can use AI and use it
14:47
to personalize.
14:48
Today, scale is not personalized, and I think the opportunity is there.
14:51
So honestly, no fear on our side, we're pretty much AI junkies in my company
15:00
and especially my team.
15:01
If anything, it's building the trust. So before you start using it,
15:09
people are
15:10
a little apprehensive. I mean, is this trustworthy? Is this data usable? I mean
15:15
, and there's a lot of,
15:16
you know, your AI results are only going to be as good as your data. So there's
15:22
got to be a lot
15:23
of infrastructure work to make sure that, when you've rolled it out, people can
15:27
actually trust it because
15:28
there's, you know what they say, no second chance for a first
15:31
impression. So if
15:32
anything, that was not an issue because, you know, we were prepared for it, but
15:37
that was the area that
15:38
we kind of were worried about, but not really about replacing people, our
15:41
customers,
15:42
for better or for worse, they love to meet us, they love QBRs, and that's not
15:47
going to change.
15:47
We're not going to have AI run these things. If anything, they're going to have
15:51
more time for
15:51
these meetings. I would agree. Overall, I would not be fearing that
15:59
AI would replace
16:00
us all or anything. I would look at this situation from a different perspective: by
16:06
using AI so that
16:08
it amplifies all the strengths that we have. And for example, especially for
16:15
high-touch customers,
16:16
where we have direct communication and collaboration with the customer. And I
16:21
think that AI and also
16:25
customer success should work closely together, so that, for example, AI is
16:29
doing the data-heavy
16:31
lifting while customer success managers are doing the strategic, empathetic,
16:39
collaborative job with
16:41
the customers. And then with this combination, like it's like a perfect
16:45
marriage of two sides,
16:48
we can really benefit from it. So I would
16:53
never think that it
16:54
actually would replace some of the customer success jobs there.
17:00
It's good to hear we have confidence. We'll still have jobs. That's good.
17:03
Curious though,
17:05
as we've kind of been implementing AI, we feel confident in it overall, it
17:09
sounds like.
17:09
Doesn't mean there aren't roadblocks and hurdles and maybe challenges to getting
17:13
either
17:14
it adopted effectively or delivering it in the right ways for customer success.
17:17
So curious about maybe challenges, roadblocks, hiccups, or learnings that everyone
17:22
has had in
17:23
leveraging or rolling out AI within your business. Curious if you could share a
17:26
little bit about
17:27
what any of those roadblocks have been. And Maria, maybe we start on this end this
17:30
time.
17:30
Sure. So I can say from the corporate side, that was quite challenging to get
17:37
the approval from
17:38
all the departments to get started. Even though we know that all the AI things
17:44
are GDPR compliant,
17:45
with all the concerns that SAP has and its strict legal requirements, it was
17:51
quite challenging
17:53
to get the approval, but then afterwards it was done and finally everyone is
17:57
benefiting out of it.
17:59
So that was the first challenge. So data protection, data privacy. Second
18:03
challenge that is still going
18:05
on and I think that we all need to work on is the adoption of those AI features
18:10
because
18:10
it is always nice to have those, but if no one is using that or if no one is
18:17
actually benefiting
18:18
out of it, then it doesn't bring any value. So we all need to push our CSMs to
18:25
use those
18:25
features. So then they see the value itself, and then with these selling points,
18:33
then they can also
18:34
spread the word of mouth and then also share it with all the other CSMs that,
18:39
okay, I'm using
18:40
this cheat sheet because it saves me so much time and so on. So those are the
18:43
two challenges
18:45
we are having right now. No, the first one is done. Just the adoption. So
18:51
adoption is, I think that
18:51
will always be quite a hard topic. So for us, I mean, it's pretty much
18:57
what I said earlier,
18:59
it was really setting up the data right and making sure, you know, we don't
19:06
cross paths
19:07
between data points that shouldn't be crossing paths. So data attribution is a
19:12
problem in many
19:13
different AI tools. And as I said earlier, the result is only going to be as
19:18
good as your data.
19:19
So that was one roadblock, and we're still actively working on it, even myself.
19:23
The other thing from a privacy standpoint, to your point Maria,
19:30
for this to be effective, we need people to open up digitally, right? Because
19:38
it's all about
19:39
getting that cumulative insight of all of our interactions with the customer.
19:44
So I've had some
19:46
issues with that. And you know, I even had one of our C levels asking me, I
19:50
want you to remove me
19:51
from that list because, you know, some of the stuff I'm discussing with
19:55
customers is
19:55
sensitive, I don't want to, you know, I don't want to see myself in
19:58
notifications. I don't want
19:59
to see other people, you know, reading my emails basically. So that was, but I
20:05
can safely say it was,
20:07
you know, very few instances for the most part, you know, people are fine with
20:10
it and they understand
20:11
the benefit from it. So, yeah, these are the two things. Privacy a little bit
20:16
and setting up the
20:17
data, making sure it's reliable. Yeah. So again, I think to echo what Maria said
20:24
, I think
20:25
I see a lot of organizations challenged on legal and privacy. And I see it's kind
20:31
of like a battle
20:32
between the buyer, the business, you know, folks like yourselves who want to
20:36
use AI because you
20:37
have a, you know, need for it. And then there's legal and privacy that their
20:42
job is really to
20:43
kind of secure the organization. I'm not even talking about infosec, I mean, that's
20:46
another topic. But
20:47
even if you have everything, GDPR and CCPA and, you know, data centers and all
20:51
that stuff.
20:52
So it really is kind of a, it's like almost like a little bit of an internal
20:58
battle where,
20:59
you know, you have to drive innovation because you're under pressure to drive
21:03
business results,
21:04
right? I mean, you're being asked to improve GRR, improve NRR without having a
21:10
bigger team,
21:11
right? So you need technology, but then when you bring technology, somebody
21:15
says, well,
21:15
that's not secure. So I think that that's kind of a little bit of the friction
21:19
that we're seeing
21:20
in the field. But we're seeing at the end of the day, more and more ways to
21:24
overcome this,
21:25
either through different levels, right? So you can, for example, in Amit's
21:28
example,
21:29
you can exclude certain people or you can mask certain things. So at the end of
21:33
the day,
21:33
you have to find kind of a harmony inside the organization. I think that's
21:36
really the key
21:37
because it also, it's a directive, right? If you're not going to do it, you're
21:42
kind of
21:43
going to be left behind, to use that term. Yeah. Great. Just to add to that, I
21:48
think, I mean,
21:49
to your point, there are compromises that could be made. For example, those,
21:53
you know,
21:53
notifications with sensitive information are only open to a very specific group
21:58
of people
21:58
at companies. I don't think everyone at the company can see these things, but it's
22:02
the people that
22:03
can actually make an impact, usually leaders. And obviously, if you were a CSM
22:08
and it's your
22:09
account, certain alerts you'd be seeing, but there are ways to do it safely and
22:14
securely,
22:15
while also, you know, getting the benefit out of it. Just to share with
22:20
everyone.
22:21
Yeah. It's awesome. So it sounds like privacy, security concerns, adoption
22:25
concerns,
22:26
I think you mentioned as well, Maria. Are there any kind of ways that you're
22:30
seeing to kind of
22:30
get over those challenges or anything you're seeing work really well to
22:33
overcome some of those
22:34
common barriers in CS, maybe based on your experience with customers, what you
22:38
've kind of seen?
22:39
Yeah. So I think like anything nowadays, anything you're going to buy, right?
22:42
You need a business
22:43
case. And, you know, there's a framework. You know, we have a framework in
22:48
place, but I think,
22:49
you know, happy to share kind of that. I think at the end of the day, it's an ROI
22:53
question. If
22:53
to use that term, even though it's not pure ROI, but still again, you're going
22:57
to come and say,
22:58
if I implement this, I can do so-and-so cost saving, so hours, dollars, and so
23:03
on. And I can
23:04
have an impact on revenue that might take a little bit longer to show and prove
23:08
, right? Because
23:09
if you're talking about annual renewals, you know, you'll only see that impact
23:12
over a certain
23:13
period of time or things like that. And some of it's going to be quantitative.
23:16
You could actually
23:16
show, you know, from a back of the envelope calculation to an actual
23:20
statistical, you know, analysis.
23:22
And some of it's going to be more qualitative, like based on stories that we
23:25
talked about.
23:26
So I think that those are, you know, key areas that, you know, you need to be
23:31
prepared in this,
23:32
you know, day and age. Even smaller organizations are today being asked by CFOs
23:38
and CEOs and so on
23:39
to justify a purchase of a technology. So, yeah, I think that's one area that I
23:45
think,
23:46
you know, and we have experience on that and happy to share, you know, around
23:50
that.
23:50
And you mentioned some of those levers. I mean, right, taking C-suite
23:55
executives, or CSMs, I'm in there, not taking them out kind of
24:00
of some of those
24:00
notifications. Any other tips, you know, you'd recommend to kind of overcome
24:04
some of those barriers?
24:06
So first of all, you know, to be perfectly honest, once some of our
24:10
business leaders saw
24:11
the outcomes of this, they were just astonished, they couldn't get enough of it
24:15
. So sometimes it's
24:15
just, you know, put them in front of the data, in front of the insights and you
24:18
blow them away.
24:19
And then they're willing to pay the price, you know, both financially and from
24:23
a privacy
24:23
perspective, so to speak. Otherwise, it's really about being very strict with
24:30
what you share with
24:31
whom and we're very strict about it. Because it is, you know, a little bit
24:36
intrusive, I think in a
24:38
fair, balanced way, in my opinion, that's why I'm using it. But we do want to
24:43
respect, you know,
24:44
people's privacy and so on and so forth. So to be very strict about it, I've
24:48
had, you know, people
24:49
add, we have this Slack notifications group for these extremely negative
24:53
messages and other stuff.
24:55
We've had people add, you know, people that were not permitted to this, like
25:00
immediately,
25:00
you know, you remove them, and you have to say, please don't add anyone without my
25:04
permission. There's a
25:05
lot of, you know, sensitive information running in this channel. Please be
25:09
aware of that. But
25:12
yeah, other than that, I mean, the value, the value to the business was so
25:19
great that other
25:20
people were willing to make that compromise. On the challenge that we have with
25:25
adoption, we usually
25:27
go both ways, so a bottom-up and a top-down approach. For the bottom-up
25:34
, we usually try to
25:35
find the advocates of the features or of certain things. And they are usually
25:43
also those CSMs.
25:44
They're actually then either leading some sessions where they share their
25:50
knowledge or they are also
25:52
sharing how they're using Gainsight, for example, on a daily basis. And then
25:56
with that, they are also
25:58
doing some kind of advocating on those AI features that we are using right now.
26:04
And talking about it from the point of view of the top-down approach: for example, our
26:10
customer success
26:11
directors, they're also collaborating a lot with
26:15
the other lines of
26:16
businesses. And they also see that other lines of businesses are using AI. And
26:20
it was the recent
26:22
example. Yeah, one of the directors was just coming over and saying that
26:26
, hey,
26:27
those guys, they have the cheat sheet in their Gainsight instance, and we do not
26:32
have it. Why and how
26:33
come? Can we implement it? And then, yeah, we were just about to open it to all
26:41
the audience of
26:43
Gainsight, and then we were just like, okay, you just need to be patient.
26:46
Why wait a sec?
26:47
But overall, once you have this backing also from the directors, and
26:52
you see that they
26:53
are going to be using it, then everyone has this kind of a spark within
26:58
themselves. And then
26:59
they can also leverage these features. Awesome. Well, we have so many good
27:05
questions. I'm seeing
27:06
scrolling on the Slido over here. So I want to make sure we have some time to
27:09
get to those. I'm
27:09
going to ask one more question of our panelists first. Thinking ahead and
27:13
looking to the future
27:14
of AI, what are you most excited about for AI to bring to the table for
27:17
customer success?
27:20
Sure. So, from my perspective, I wish I would have more data consistency, and I
27:31
wish that AI
27:32
would be able to bring this data into the right buckets and then process it
27:39
more effectively.
27:41
But if we're talking more realistically, I would love to see more inputs on the
27:49
cross-sell and upsell opportunities. Yes, we do have a lot of calls to action.
27:54
We do have a lot
27:54
of automations behind that recognize, okay, there is a potential upsell
28:00
or cross-sell,
28:01
but still, I wish we would have more of that. I wish we would have more
28:06
proactive approach rather
28:07
than reactive, as it's quite important to be very sensitive to all the changes
28:14
the customer has.
28:15
And then I really hope that, with the features that Gainsight and Staircase have,
28:23
we can also implement them as fast as possible so the team can also
28:30
benefit out of it.
28:33
I think I said my piece on stage earlier today, but to me, it's all about
28:38
taking the digital
28:39
journey to the next level. So, if you will, introduce a Netflix or a Spotify-
28:44
like experience,
28:45
to customers in B2B. So, it's not just about learning and creating the
28:50
visibility,
28:51
but also proactively approaching the customers in-app. I mean, if you did that
28:56
, you might also
28:57
be interested in that. Oh, I see that, according to the role, you should
29:00
probably be interested in
29:01
that. These are some of the best things other people in your category have done
29:04
with this feature.
29:05
And then, you know, fusing all these data points together about the
29:12
health of the
29:12
customer, about usage drops, about sentiment and topics that come up on calls
29:18
to create
29:19
that proactive messaging, going back to the customer to put them back on track,
29:23
and to make sure they actually, you know, are doing the right things to meet
29:30
their KPIs,
29:31
and their goals, which they set when they bought the tool, right? They had to
29:34
make a business case
29:35
to their leadership to get a product. So, let's remember what these things were
29:40
, and let AI
29:41
kind of help you navigate towards those goals. I don't want
29:45
to say this is far-fetched,
29:47
because I think we're going to get there in a few years,
29:49
but that's what I
29:51
see AI doing in the future. Yeah, so I think the agentic approach
29:58
, which I think a
29:58
lot of folks have been reading about, I'm 100% on board with that,
30:02
and happy to share
30:04
more thoughts on that. I actually think to take one step back, I think a lot of
30:07
organizations
30:08
know what needs to be done, right? You kind of have a real good understanding
30:11
of your business,
30:12
and you have really good processes in place, but half of the time we just
30:16
don't know if they're
30:17
actually happening. And when we do a
30:21
churn analysis or a
30:22
win-loss analysis, I would say 80% of the time we find out
30:26
that some stage just
30:27
didn't happen, right? And I'm not talking about those extreme,
30:31
out-of-the-box
30:32
situations, right? So I think one promise of AI is to do this constant,
30:37
I would say like
30:38
almost like a real-time audit of what's happening. Like is the organization
30:42
working the way it should
30:43
be? If I think about a car, right, there's all these
30:45
computers now in cars,
30:47
and every time your tire pressure is off, or something else is off,
30:49
you know,
30:50
they alert you and let you know. And there's a reason, right?
30:53
Because somebody
30:53
said that the tire pressure needs to be, I don't know, whatever it is, 40 psi,
30:57
right? So the same
30:58
thing in your organization, you have a list of things that need to happen, you
31:01
know, you guys are
31:02
experts, but you just don't know half of the time if they're happening, if
31:05
people, you know, are
31:06
following through or if they have other challenges. So I'd say that is, you
31:10
know, one core area. The
31:13
second is, looking at today, it's very much question and answer. I
31:17
write a question,
31:18
you know, in LLMs, and I get an answer, but wouldn't it be nice if, you know,
31:23
they kind of anticipated
31:24
my question, right? If the data already came and showed me that, you know, that
31:28
there's, you know,
31:29
a skew in the efficiency, then maybe it starts to kind of do the second or
31:33
third level of querying
31:35
for me. So it saves me time. So there's that aspect of that, and we are working
31:39
on those things.
31:40
It connects to an agentic flow where the idea here is that different parts of
31:44
the platform are
31:45
asking questions and talking to each other to kind of almost mimic like how a
31:49
person thinks.
31:50
But I would say that those are, I think, things that are not, you know, we're
31:53
not talking about
31:54
robots in the future. These are things that could happen very soon and could
31:57
really drastically
31:58
change the operational view of an organization. Awesome. Well, I'm very excited
32:04
for the future
32:05
with AI, all of these great use cases included. I think there's so much we
32:08
can anticipate from
32:09
it. I do know we have a couple of really good questions that have come in on
32:12
our Slido; if
32:13
we can throw those up, perhaps, we'll go through some of those with the
32:15
panel in the time we have
32:16
left. First question, Maria, I'll ask you this one. If you could do one thing
32:22
with AI with your
32:23
CS team, and you don't have Staircase, what would it be? So I would say we do
32:31
not use Staircase yet,
32:32
but we do have some features within Gainsight. And I'm a big fan of
32:39
the NPS analysis
32:40
and also CSAT, customer satisfaction. Gainsight has a very nice
32:50
feature that
32:50
allows you to bucket all the customers and see if they are really promoters
32:57
or if they are
32:58
the ones that need support, and then, based on these revenue
33:03
metrics, you can also
33:04
identify what kind of customers you would like to focus on. And coming back to
33:09
NPS, it also gives
33:10
such a nice opportunity to see what kind of things and
33:15
what kind of features
33:16
or main topics customers are struggling with and what they appreciate. So
33:23
then you can focus more
33:25
on the conversation and on leading the collaboration with the customer on that.
33:31
So yeah, the NPS analysis and all the analytical stuff is my passion there,
33:39
and I really like that.
33:40
This next one, Amit, I'll ask you this question. Since it sounds like you use
33:46
AI pretty heavily
33:47
internally, what are some of your top tips to prepare CS teams to use AI
33:51
most effectively?
33:52
So first of all, you need to get them hyped on AI and show them, you know, how
33:57
much work you're
33:57
going to save them if they adopt AI, and this is the best seller, right? Where there's
34:02
a will, there's
34:03
a way, right? That's what they always say. So this is the
34:05
first thing.
34:06
Sell it to your teams. And then after that, you know, I heard it on the other
34:12
session that we
34:13
were listening in on. AI is not a silver bullet yet. Maybe at some point it
34:17
will be. So, you know,
34:20
CSMs need to understand that this is going to enhance and augment them. It's
34:26
not going to replace
34:27
them. And so, you know, they have to be mindful. Again, I'm repeating it like a
34:31
broken record,
34:32
but it's only going to be as good as their data. So if they don't keep data
34:36
hygiene,
34:36
(unfortunately, there are still going to be data hygiene tasks), then
34:40
things could go sideways.
34:42
So these are the two things: understand what you're getting, but also what
34:45
you have to give
34:46
in return for this to be effective and to keep serving you.
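The data-hygiene point can be made concrete with a small audit sketch. The record fields (`champion`, `exec_sponsor`, `last_verified`) and the 90-day staleness threshold are hypothetical choices for illustration, not any specific CRM schema:

```python
from datetime import date, timedelta

# Hypothetical account records; field names are illustrative, not a real schema.
accounts = [
    {"name": "Acme", "champion": "J. Doe", "exec_sponsor": None,
     "last_verified": date(2024, 1, 10)},
    {"name": "Globex", "champion": None, "exec_sponsor": "K. Lee",
     "last_verified": date(2024, 6, 1)},
]

def hygiene_issues(account, today, max_age_days=90):
    """Return a list of data-hygiene problems for one account record."""
    issues = []
    for field in ("champion", "exec_sponsor"):
        if not account.get(field):
            issues.append(f"missing {field}")
    if (today - account["last_verified"]) > timedelta(days=max_age_days):
        issues.append("stakeholder data not verified in 90+ days")
    return issues

today = date(2024, 7, 1)
for acc in accounts:
    problems = hygiene_issues(acc, today)
    if problems:
        print(acc["name"], "->", ", ".join(problems))
```

Running a check like this regularly, before feeding records to an AI workflow, is one way to keep the "garbage in, garbage out" risk visible.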
34:49
Awesome. This next one, oh, the question's changed. Okay, there we go.
34:56
Collecting all the data on customer interactions, and who in your organization
34:59
they had contact with: is GDPR a topic that comes up with customers?
35:09
Ori, maybe you could answer this one. It's always. I would say there's never a
35:14
case where it isn't.
35:15
And it doesn't matter if it's, you know, Staircase or Gainsight or whoever
35:19
you're talking to,
35:21
you should definitely make sure that, you know, you have first of all those
35:24
credentials, right?
35:25
You know, from SOC to GDPR, and if you're selling in the US, CCPA and all that
35:29
stuff.
35:30
But usually that's kind of the base level. Beyond that, it really
35:36
depends on the
35:38
organization's perspective. From a GDPR perspective, there's no problem with
35:44
that analysis.
35:45
A lot of times people think GDPR is one thing, but really, it's
35:51
not, and it doesn't
35:52
cover this case, by the way. There are other cases. For example, in
35:55
Germany, there's more
35:56
sensitivity to looking at a certain employee level, and maybe in France too.
36:01
And, by the way,
36:02
GDPR doesn't necessarily force you to
36:07
host data in the EU,
36:08
but most companies want it there. So overall, I
36:13
would say there are
36:15
not restrictions, but an organizational philosophy around data and
36:19
privacy. Remember
36:22
that there is a trade-off: you need an insight, and for an insight,
36:26
you need data,
36:27
right? So at the end of the day, we can't create an insight out of the air. We
36:31
do have to be very
36:32
conscious about privacy, security, and all that. But there has to be a trade-
36:37
off between the two.
36:38
And once again, we, like many organizations, are GDPR compliant and
36:45
can still
36:47
do the analysis of interactions. Awesome. And there's, I think, another follow-
36:51
up
36:51
Staircase question I'll ask you as well, Ori. I'm going to jump down to: it
36:55
sounds like
36:55
Staircase is very reactive, providing info after the negative email, after the
36:59
negative sentiment.
37:01
How can it be used to be more proactive and prevent that email from ever
37:04
happening?
37:05
That's a great question. So when we started Staircase, we did a lot
37:08
on the predictions
37:09
side of things. And I always give this example of the weather, right? How
37:15
far into the
37:15
future can we predict the weather? Typically, a week to 10 days. So I'd be very
37:20
careful of anyone
37:22
who's promising you predictions way into the future, especially in the world
37:27
that we live
37:27
in, customer success, where there are so many things that are changing. And
37:31
therefore, you know,
37:34
a predictive capability way into the future is very challenging. Where you can
37:40
use Staircase in
37:41
a proactive approach is this: many times, yes, there is a negative email, but it might
37:46
have been the third
37:47
or the fourth, right? So wouldn't it be nice to see the first one, right? So we
37:52
've done many
37:53
analyses in retrospect, on historical data, and you find that, just like in the
37:57
demo that I did on
37:58
stage, you find that churn occurred, and you actually can see it on a timeline,
38:01
and you'll see there was
38:02
an extremely negative, and then two weeks before that was another extremely
38:05
negative,
38:06
and two weeks before that was another extremely negative. So yeah, the third
38:08
extremely negative
38:09
doesn't really help anybody, right? Because by that time, the customer is gone.
38:13
But after the
38:14
first one, if it was escalated correctly in the organization, if it was dealt
38:18
with, and so on and
38:19
so forth, you could have, you know, been proactive around that. So it's almost
38:24
like zero reaction time
38:26
versus prediction. I think this next one's also a technical question. So sorry,
38:31
I'm going to send
38:32
it your way. How can we combine internal AI engines with Gainsight AI?
38:36
Yeah, so there are a couple of ways to do it. One is by sharing the output:
38:48
if you have an AI
38:49
engine that's doing something today that you think is useful, you can basically
38:53
share the output of
38:54
that engine, which can be fed into Gainsight or into Staircase, and then can be
39:00
used.
39:02
For example, let's say you're doing some kind of
39:05
analysis on usage
39:06
data through an AI engine, or it's part of your health score, something like
39:11
that. You can then take
39:12
that output and feed it into Gainsight and use it in
39:17
another, more robust,
39:18
health score. That's just one example. We don't, at this moment, allow external
39:23
engines to interact
39:24
with each other, and I'm not really sure, you know, how we would do that or why
39:27
we would do that.
39:29
But if there's a specific use case, then I'm happy to talk offline after this.
39:34
And we don't have
39:35
connectivity or shared data with external engines, for a lot of
39:39
reasons that we talked
39:41
about. But I would say that usually there's a pipeline in machine learning, so
39:44
there's an output
39:45
of one model that feeds into another model that feeds into another. So that
39:50
could be the way to do
39:51
that through the data layer. Awesome. I'm actually going to skip
39:56
down to the how-to
39:56
video question. So this one's for you, Amit. You mentioned using a how-to
40:00
video creation tool.
40:01
Do you want to share a little bit more about that? Of course. It's a great tool
40:03
. It's called
40:04
Guidde, G-U-I-D-D-E. And it basically tracks your keystrokes and mouse clicks
40:11
within your app,
40:13
and creates a video or, like, a slide deck with automatic narration, like
40:19
describing what you've
40:20
done. And then you can also edit the text and, you know, do whatever you want
40:25
with it. So you
40:26
create videos in seconds for customers. And, talking about ROI, we actually
40:30
use it for our academy.
40:32
So instead of using these very specific enablement platforms, we just create
40:36
these videos and then
40:38
push them in-app. So yeah, it's called Guidde and it's a great tool. Awesome.
40:42
Sounds like a good one.
40:44
I think I'm going to ask you one more question, Amit, and then we'll get to you,
40:46
Maria, for the next one.
40:47
But I'm going to skip down to: where do you recommend starting
40:50
when you're cleaning
40:51
your data to prepare for AI? You've mentioned that a couple of times. That's a
40:54
great, that's a great
40:55
question. I'm sure Ori knows what I'm going to say. But make sure that if you
41:02
're selling your product
41:04
through a partner ecosystem, and many of us do, especially in B2B, you have to
41:09
be very careful
41:10
about it, because that could mix data and cross-reference data in a way it
41:14
shouldn't.
41:16
So that's the first thing I would do, and that's the first thing we did
41:18
ourselves: we basically
41:20
removed all these joint domains and joint partners. And it
41:28
proved itself, because
41:30
we mostly wanted to cover the customer interactions anyway. So as long as the
41:35
customer was
41:36
participating in an interaction, we would get the data, it would feed the
41:39
engine, we would get the
41:40
alerts. So that's the first thing I would do. And the other thing is your
41:44
contact data or people
41:46
data. This is a tough one, because it's a lot of work. In the industry I come
41:51
from, there's
41:52
a huge turnover. So it's a lot of work updating the stakeholders, who's the
41:57
champion, who's the
41:58
executive sponsor, so on and so forth. But it does pay off. So that's the two
42:02
things. Partner
42:03
ecosystem, and contact or people data. >> Awesome. Maria, I'll ask you
42:10
this next one.
42:14
For the second question here, when you're thinking about using AI for meeting
42:18
summaries, you mentioned
42:18
using Cheat Sheet as well, right, to summarize accounts. Do you typically find
42:22
that the AI gets
42:23
things wrong? Is it typically right? How do you make sure it is effective for
42:27
you and your needs?
42:28
>> So the most important thing, as also mentioned already by everyone
42:34
here,
42:34
is that CSMs are feeding the data into the timeline on a
42:40
regular basis. So then
42:43
AI can identify and summarize everything in a better way. But for
42:49
most of the things
42:50
that we've used so far, I haven't seen that there are some big discrepancies.
42:56
Yes, we are making sure
42:58
that CSMs are aware that they need to cross check if they want to dive into the
43:03
numbers.
43:04
They can cross check because, again, it pulls from all the timeline activities
43:09
and all the emails
43:11
that were already fed to Gainsight. So they need to be aware that there might
43:17
be discrepancies,
43:18
but overall, for most things, it is a very big help for both
43:24
them and also
43:25
directors who are also sometimes interacting with the customers. >> Awesome.
43:31
>> Ori, I'll ask you this next one. I'm going to skip to, I think it's the
43:35
second on the screen.
43:36
You mentioned business cases earlier. What advice would you give to someone who
43:39
sees the value of
43:40
something like Staircase but needs to position it internally to the finance
43:45
folks, the people with
43:46
the purse strings? >> Yes. Again, I use this framework all the time. On one
43:50
hand, there's cost
43:51
cutting. On the other hand, there's revenue enhancement. On the cost-cutting
43:54
side, just think about,
43:55
for example, feeding in data. With Staircase, you can minimize
44:01
the amount of
44:02
quote-unquote manual note-taking. So I think every digital interaction is
44:08
captured by Staircase,
44:08
but if you went and had lunch or dinner with a customer, obviously Staircase
44:12
doesn't know about
44:13
that. Those notes should still be diligently written by the CSM, but
44:17
typically, I would say
44:18
it's probably 80/20, 80% is digital, and you already removed hours and hours
44:25
from a CSM, and we can do
44:26
that calculation together. We've done that. The other thing, which I think was
44:30
mentioned here, is
44:31
summarization and prepping for meetings. So you can build a framework: I would
44:35
look at a
44:36
day in the life of the CSM and look at how much time they're spending on various
44:41
tasks and see what would
44:42
be the impact of AI on those tasks. So prepping for a meeting, let's say it takes
44:46
an hour
44:47
old school; maybe now it takes 10 minutes. Logging data on a
44:52
daily basis, so maybe
44:53
you spend two hours a day, maybe with AI, you now spend 30 minutes a day, and
44:57
so on and so forth.
44:58
So that would be the cost-cutting side. On the revenue side, we have several
45:02
statistical models,
45:03
which we can share, based on historical examples and existing
45:08
customers,
45:09
about the propensity of saving an account. And just to kind of give you the
45:12
framework,
45:13
you know, churn has a lot of reasons, right? So I would say about a third of it
45:17
happens
45:17
due to, let's say, commercial reasons, right? Pricing, issues like that. A third
45:22
probably happens due to
45:23
things that are market-driven: acquisitions, companies
45:27
went out of business,
45:27
things like that. And a third is probably related to service and product, right
45:31
? And so that third
45:33
that's commercial and that third that's product and service, that's an area
45:37
that we can definitely
45:38
work with. So okay, let's say 60% of my churn; I'm just going to run some
45:41
numbers. Say you have
45:43
a million dollars of churn, or 10 million dollars. So $600,000, or 60% of
45:48
that, is already something
45:49
you can work with. Now let's say you want to be very conservative: out of that
45:53
60%, let's say
45:54
only about a third of that you can save. So if it's $600,000 or $6
45:59
million, it's $200,000
46:00
or $2 million. Compare that to the cost of the software.
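Ori's back-of-the-envelope math can be sketched in a few lines. The 60% addressable share and the conservative one-third save rate are his rough assumptions for the framework, not fixed constants:

```python
def addressable_churn_savings(annual_churn_dollars,
                              addressable_share=0.6,
                              save_rate=1/3):
    """Estimate recoverable churn under the rough framework above:
    ~60% of churn is commercial or product-and-service driven (addressable),
    and conservatively ~1/3 of that can be saved."""
    addressable = annual_churn_dollars * addressable_share
    return addressable * save_rate

# $1M of churn -> ~$200K potentially recoverable; $10M -> ~$2M.
print(round(addressable_churn_savings(1_000_000)))   # prints 200000
print(round(addressable_churn_savings(10_000_000)))  # prints 2000000
```

Swapping in your own churn figure and assumptions gives the number to compare against the software cost.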
46:04
Awesome. Well I think we are out of time. We have a couple good questions left,
46:09
so feel free to
46:09
find any of our panelists and ask those directly afterwards. So I want to give
46:12
everyone a huge round
46:13
of applause for all of your great insights. And I think we have a short break
46:20
between this and our
46:23
next session. So again, thank you all so much for sharing all of your great
46:26
insights, learnings,
46:27
and information about AI in general. Thank you all so much. And thank
46:30
you so much everyone.