The WealthTech Podcast

Agentic AI and the Future of WealthTech | Dr. Sindhu Joseph

Mark Wickersham

In this episode of The WealthTech Podcast, host Mark Wickersham speaks with Dr. Sindhu Joseph, Founder & CEO of CogniCor, and a true pioneer in artificial intelligence for financial services. With a PhD in AI, over 20 years of experience, and multiple patents, Sindhu has been at the forefront of building practical AI applications long before large language models hit the mainstream.

She shares her founder’s journey—from growing up in India and seeing firsthand the impacts of wealth disparity, to launching CogniCor as a purpose-driven platform that empowers financial advisors to scale personalized advice and democratize access to wealth.

The conversation explores:

✅ Why the wealth management industry needs an AI “co-pilot”

✅ How CogniCor addresses tech fragmentation by creating an “intelligence layer” for advisors

✅ The promise and risks of agentic AI, and what it means for day-to-day workflows

✅ Guardrails firms should consider when adopting AI

✅ The broader societal implications of AI—from job disruption to redefining the human role in a technology-driven future

If you’re curious about how AI will reshape wealth management, this episode offers both a visionary and practical roadmap.

About Dr. Sindhu Joseph

Dr. Sindhu Joseph is the Founder and CEO of CogniCor, a leading AI company serving the wealth management industry. With a PhD in Artificial Intelligence and over 20 years of experience in building AI solutions, Sindhu holds multiple patents in machine learning and natural language technologies. She launched CogniCor to address the fragmentation in financial technology and to create an industry-specific AI “co-pilot” that helps advisors reduce administrative burdens, scale personalized advice, and democratize access to wealth.

A recognized thought leader in AI, Sindhu has lived and worked across India, Europe, and the U.S., bringing a global perspective to how technology can transform financial services and society at large.

About CogniCor

CogniCor is an AI platform built for wealth management, founded by Dr. Sindhu Joseph. Its industry-specific AI “co-pilot” helps financial advisors cut through tech fragmentation, automate workflows, and deliver scalable, personalized advice. By acting as an intelligence layer across CRMs, planning tools, and portfolio systems, CogniCor empowers firms to serve more clients efficiently while democratizing access to wealth.

About The WealthTech Podcast:
The WealthTech Podcast is a bi-monthly interview series hosted by Mark Wickersham. Each month we present conversations with various industry leaders that focus on the challenges family wealth firms face with technology, people and process. The podcast is produced by Brad Oliver.

The WealthTech Podcast is brought to you by the generous support of Risclarity. Risclarity fills the technology gaps family wealth firms face when serving the complex needs of ultra-high net worth families.

Disclaimer
Information provided is for educational purposes only. Opinions expressed and estimates or projections given are as of the date of the presentation; there is no obligation to update or provide notice of inaccuracy or change.

The WealthTech Podcast Transcript

Host: Mark Wickersham

Guest: Dr. Sindhu Joseph, Founder & CEO, CogniCor

Title: Agentic AI and the future of WealthTech

 00:00:04.550 --> 00:00:08.819

Mark Wickersham: All right, welcome to the podcast. And have I

 00:00:09.080 --> 00:00:21.519

Mark Wickersham: been looking forward to this conversation. I want to get into all things AI. I was wondering if you maybe could do a brief introduction of yourself and the firm that you founded.

 00:00:22.020 --> 00:00:44.899

Dr. Sindhu Joseph: Yeah, of course. Thank you so much for the invite. I'm excited to be here. My name is Dr. Sindhu Joseph. I'm the CEO and founder at CogniCor. My background is in AI. I spent probably around 15-20 years building AI solutions. I have a PhD in artificial intelligence, and also hold several patents in machine learning and natural language, and all of those

00:00:44.970 --> 00:01:04.349

Dr. Sindhu Joseph: before the LLMs came out. So like, you know. So I'm passionate about this field. I have been in this for a long time, and very excited to see what is happening in this industry, and how this is reshaping, not just industries, but also the society as such.

00:01:04.349 --> 00:01:23.080

Dr. Sindhu Joseph: So I founded a firm 5 years back, called CogniCor. CogniCor is a co-pilot, not the Microsoft co-pilot. It's a co-pilot, specifically designed for the wealth industry for financial advisors, and I can go into why.

00:01:23.080 --> 00:01:37.139

Dr. Sindhu Joseph: I have founded the firm, and so on. Happy to, you know. Get into more details on that. But the co-pilot essentially, is a driver like, you know, kind of a companion for financial advisors to

00:01:37.390 --> 00:02:05.179

Dr. Sindhu Joseph: prioritize their client base, or prioritize whom I should interact with today, and, you know, what are the needs that my, you know, client households have. And once I identify these needs, providing operational efficiency, reducing administrative burden, but also scaling personalized advice. So these are, like, you know, some of the

00:02:05.180 --> 00:02:15.969

Dr. Sindhu Joseph: key things the co-pilot does. So it is a, you know, industry-specific co-pilot that is built for wealth managers with the aim of, like, you know, democratizing access to wealth.

00:02:16.990 --> 00:02:22.160

Mark Wickersham: Love it. So you literally have a PhD in AI. So I think that

00:02:22.290 --> 00:02:27.159

Mark Wickersham: a lot of information is gonna be gained by our listeners on this.

00:02:27.280 --> 00:02:34.710

Mark Wickersham: And it's interesting, this, this co-pilot concept how you're trying to amplify the advisor. Obviously, there's a lot of

00:02:34.910 --> 00:03:03.050

Mark Wickersham: advisors are struggling in terms of, there's not enough talent coming into the industry, advisors are starting to age out, you start to have this giant wealth transfer that's happening at the same time. So it's really timely for that type of solution to help amplify their capabilities. Before we get into various dimensions of AI, could you just talk to me a little bit about your founder’s journey? I'm always

00:03:03.190 --> 00:03:16.190

Mark Wickersham: fascinated by talking to founders about why you started the business, what was the gap that you saw in the marketplace, and what have been some of the learnings and kind of struggles along the way.

00:03:16.590 --> 00:03:40.999

Dr. Sindhu Joseph: Yeah. So there are, like, a couple of motivations why I started CogniCor. One is, of course, you know, given my AI background, you know, I was very, very motivated to kind of revolutionize industries, and I have seen the potential of, you know, how AI can make that change. So I was absolutely clear that, you know, this is

00:03:41.000 --> 00:04:05.119

Dr. Sindhu Joseph: something that I want to do. But specifically, wealth management, specifically financial services. So that was the primary motivation to make those changes in that industry, and you need to go back a little bit into my history to understand why. What is the motivation there? So I was born and brought up in India, traveled across Europe

00:04:05.120 --> 00:04:29.149

Dr. Sindhu Joseph: for around 10-plus years before moving to the US. So I lived on 3 continents, interacted with a multitude of people and different sections of society. So early on, what I kind of saw, especially in India, and you could see it very pronounced at that time when I was growing up, is the wealth disparity, like the people in

00:04:29.150 --> 00:04:58.409

Dr. Sindhu Joseph: one section of the community are absolutely poor, and there is, you know, a smaller section that is very rich. And when I traveled in Europe, and now based in San Francisco and Palo Alto, I thought, you know, those differences would kind of disappear, but, you know, it has only become more pronounced in recent years.

00:04:58.810 --> 00:05:24.760

Dr. Sindhu Joseph: So I was really motivated to kind of see why this change happened, why this disparity exists, and how can we bring more people into that comfortable place. So one of the things that I understood was, people work really hard to make money, and this is especially true in the US. So it is not

00:05:24.760 --> 00:05:47.350

Dr. Sindhu Joseph: their lack of working; they all know how to make money. The difference between the wealthy and the poor is the majority of the population doesn't know how to put money to work for them. And wealth managers, financial advisors, are the instrument to, kind of, you know, put money to work for people.

 

00:05:47.350 --> 00:06:12.010

Dr. Sindhu Joseph: And when I looked at financial advisors, they only address the high net worth individuals and households section of the population, as well as affluent, and mass affluent, not even mass affluent, like. So how can we kind of lower the barrier of entry and, you know, cover more sections of society, like, you know, from 30% to 50% to 70 to 90%?

00:06:12.040 --> 00:06:38.190

Dr. Sindhu Joseph: So that is really my motivation. How can we bring everybody to this wealth creation, where, you know, everybody can be part of that wealth creation exercise, and actually democratize access to wealth. So that's really my fundamental motivation. I would love to see a society that is more equalized in terms of the opportunity to create wealth.

00:06:38.190 --> 00:06:57.909

Dr. Sindhu Joseph: We give opportunity to, you know, work and create money, but, you know, we don't give the opportunity to create wealth. So that was absolutely my motivation. I'm not claiming that CogniCor will solve all of this. But I think if there is one technology that can solve it, it's AI, and

00:06:58.060 --> 00:07:11.130

Dr. Sindhu Joseph: AI can, like, you know, really take the majority of the work from financial advisors and package that human-AI collaboration into a great scalable capability.

00:07:13.070 --> 00:07:22.660

Mark Wickersham: It's an interesting concept, I mean, certainly. I think financial advisors. It's a noble profession. I think it's extremely important

00:07:22.780 --> 00:07:25.690

Mark Wickersham: profession. You do see

00:07:25.910 --> 00:07:40.230

Mark Wickersham: that level of financial advice does seem to be sequestered to, you know, ultra high net worth, and firms struggle, especially the independent channel, to be able to service more mass affluent, even just

00:07:40.230 --> 00:08:00.799

Mark Wickersham: younger next gen that maybe hasn't accumulated the wealth yet, that may end up being on that trajectory, but they don't have the assets at the early stage of their career to be able to qualify for whatever cutoffs these advisors have. So I think that's great if they can lower those minimums, get a wider swath of the population, be able to get access to

00:08:01.030 --> 00:08:10.710

Mark Wickersham: the next generation younger. So they're able to make better decisions before they have wealth to be able to put them on a quicker trajectory. I think those are all.

00:08:10.980 --> 00:08:11.950

Dr. Sindhu Joseph: And I think

00:08:11.950 --> 00:08:23.580

Dr. Sindhu Joseph: it is also, you know, great ROI. You know, the main argument is it doesn't make ROI sense; with AI, it can really make ROI sense. So it is.

00:08:23.580 --> 00:08:46.390

Dr. Sindhu Joseph: There is around 12 trillion dollars left on the table that nobody is touching from a financial advisory perspective. You can really make that difference and make the ROI make sense, absolutely, with AI. So that is, like, you know, as you said, one of the key populations that I want financial advisors to address is the younger generation, the college-going

 00:08:46.390 --> 00:09:06.820

Dr. Sindhu Joseph: kids who have no clue. They really want to kind of start that financial discipline, but they have no clue where to start, how to start, and they often go into online forums where they can, you know, invest in anything, starting from gambling to everything.

00:09:06.820 --> 00:09:08.010

Mark Wickersham: Like meme stocks.

00:09:08.010 --> 00:09:08.960

Dr. Sindhu Joseph: And lose.

00:09:09.255 --> 00:09:09.550

Mark Wickersham: And.

00:09:09.550 --> 00:09:15.239

Dr. Sindhu Joseph: Yeah, and they end up losing everything. So I, I really want to make a change in that space.

00:09:16.510 --> 00:09:19.150

Mark Wickersham: What makes CogniCor unique? 

00:09:20.441 --> 00:09:28.279

Dr. Sindhu Joseph: Yeah. So for that, you need to understand the wealth industry a bit. Like, you know, if you look at the past

00:09:28.830 --> 00:09:50.320

Dr. Sindhu Joseph: 10-20 years of development, the tech stack that is built in the wealth industry is kind of, like, you know, extremely fragmented. So for a financial advisor, for an end user, or any other stakeholder in the industry, it is extremely hard for them to kind of

00:09:50.320 --> 00:10:09.450

Dr. Sindhu Joseph: deliver service that is enabled by technology. They are, you know, motivated people, individuals who want to help their clients. But to be able to do that in a seamless way is extremely difficult today. And that's because of the tech fragmentation. So they need to start looking at

00:10:09.450 --> 00:10:32.080

Dr. Sindhu Joseph: at least 10 different platforms to get a unified view of the client. This is starting from their CRM, where their, you know, client data is stored, to their planning system, their portfolio systems, their custodian platforms, and whatnot, like, you know, their estate plan. So this is just the cream of the platforms, and if you go one level down below,

00:10:32.080 --> 00:10:57.050

Dr. Sindhu Joseph: like, you know, there are a number of other platforms as well. So that is the state of the industry. Now, you know, AI tools can create insights and intelligence that can help advisors. But the way in which these tools are coming out in the market, like, I'm sure one of the most popular use cases for AI is a note taker. So you have an AI note taker

 

00:10:57.050 --> 00:11:10.299

Dr. Sindhu Joseph: for financial advisors. You have an AI scheduler, you have an AI onboarding solution. So all of this is again fragmentation, the next layer of AI fragmentation that we are introducing into the market,

 

00:11:10.300 --> 00:11:11.673

Dr. Sindhu Joseph: I believe.

 

00:11:12.360 --> 00:11:30.710

Dr. Sindhu Joseph: Instead, like, you know, if we look back after, you know, 10-20 years into the future, we would see 2 layers of fragmentation now the advisors have to deal with, and instead of AI solving problems, AI would be introducing problems there. So CogniCor

 

00:11:30.710 --> 00:11:55.690

Dr. Sindhu Joseph: thinks that, you know, AI is capable of much more, and can actually solve the existing fragmentation and not introduce a new AI fragmentation. So that is why it is different. So we, CogniCor, want to be the intelligence layer, or the insights layer, which sits between the data records layer and the engagement

 

00:11:55.690 --> 00:12:11.295

Dr. Sindhu Joseph: layer and connect all of these different sources together on a real-time basis. Not as a, you know, you are not building a huge data lake or a data warehouse project where we are bringing all of the data together. But

 

00:12:11.950 --> 00:12:34.810

Dr. Sindhu Joseph: today, there are tools in the market that allow us to read structured, unstructured, semi-structured, everything in between. So you can actually connect these different sources together, bring the data together into one single intelligence layer where you are providing holistic insights and recommendations, highlighting opportunities,

 

00:12:34.810 --> 00:12:45.180

Dr. Sindhu Joseph: highlighting risks, and helping advisors to take action. So all of this can happen in a single intelligence layer. So CogniCor is the only solution in the market

 

00:12:45.180 --> 00:12:51.450

Dr. Sindhu Joseph: that provides this intelligence layer that creates that end-to-end experience for the financial advisors.
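
To make the "intelligence layer" idea concrete, here is a minimal Python sketch of how such a layer might assemble a real-time household view across systems of record and derive a simple insight from it. The connector interface, field names, and the single rule-of-thumb check are illustrative assumptions, not CogniCor's actual architecture or API.

```python
from dataclasses import dataclass, field
from typing import Protocol


class SourceConnector(Protocol):
    """Read-only interface each source system (CRM, planning, portfolio,
    custodian) would implement in this sketch."""
    name: str

    def fetch(self, household_id: str) -> dict: ...


@dataclass
class HouseholdView:
    """Unified, on-demand view assembled across sources (no data warehouse)."""
    household_id: str
    facts: dict = field(default_factory=dict)     # raw facts, keyed by source name
    insights: list = field(default_factory=list)  # derived opportunities/risks


class IntelligenceLayer:
    def __init__(self, connectors: list):
        self.connectors = connectors

    def build_view(self, household_id: str) -> HouseholdView:
        view = HouseholdView(household_id)
        for connector in self.connectors:
            # Pull each system's slice of the picture in real time.
            view.facts[connector.name] = connector.fetch(household_id)
        view.insights = self._derive_insights(view.facts)
        return view

    def _derive_insights(self, facts: dict) -> list:
        # Placeholder for the reasoning step (LLM- or rule-driven); one
        # illustrative check stands in for real recommendations.
        insights = []
        plan = facts.get("planning", {})
        if plan.get("probability_of_success", 100) < 85:
            insights.append("Plan success probability below threshold; review contributions.")
        return insights
```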

 

00:12:52.580 --> 00:12:58.459

Mark Wickersham: This intelligence layer goes over the top of all these applications. I guess that you know these advisors build a

 

00:12:58.580 --> 00:13:25.579

Mark Wickersham: a best-of-breed tech stack. They've got a CRM, you know, multiple custodians, a portfolio accounting system, a financial planning system, note taking. Obviously, that seems to be an early-stage adoption where there are specific note takers out there; I think the adoption's like 70% with note takers, although I'm not sure how those note takers are going to survive with Copilot coming out with their own capabilities.

 

00:13:25.580 --> 00:13:50.540

Dr. Sindhu Joseph: Yeah, I think the note takers as such, you know, if they are a point solution, it is hard for them to survive, and it should be the case, because, imagine you have a traditional office and you're hiring an assistant. You wouldn't hire one assistant for bringing your coffee, another one for, you know, bringing your printed copies,

 

00:13:50.540 --> 00:14:01.260

Dr. Sindhu Joseph: another one for booking your tickets. You would hire one assistant who can do all of this, you know. Maybe interface with different people. But get all of these done.

 

00:14:01.260 --> 00:14:10.079

Dr. Sindhu Joseph: Why do you want that single person? Not just because, you know, you don't want to pay 10 different people, but also because, you know, this person needs to know

 

00:14:10.080 --> 00:14:32.750

Dr. Sindhu Joseph: the holistic view of you like you know, what are your preferences? How do you like to travel? When do you like to meet people? How do you want to take notes. All of this, this single person has the view of that, so that intelligence layer, that co-pilot is actually being that single assistant who has that holistic view.

 

00:14:32.750 --> 00:14:51.979

Dr. Sindhu Joseph: not only of the household but of the advisor preferences, and then interfaces with different applications, different planning, custodian, portfolio solutions, rebalancing, all of those solutions, to get things done in a proactive,

 

00:14:51.980 --> 00:14:57.349

Dr. Sindhu Joseph: personalized manner. So I think that should be the approach the industry should be taking.

 

00:14:58.370 --> 00:15:08.920

Mark Wickersham: Okay. So you basically have an assistant that works with all the different departments, versus having an assistant that works with just one department and having to hire 10 assistants.

 

00:15:09.390 --> 00:15:13.300

Mark Wickersham: That makes sense. How long does it take to stand up your solution.

 

00:15:13.300 --> 00:15:36.650

Dr. Sindhu Joseph: So most of it is pre-built, so it is kind of a plug-and-play experience. Of course, you know, the more the level of integration, the better the experience would be, the better the insights and recommendations. So, for example, if we have a pre-built integration with one of the planning

 

00:15:36.650 --> 00:15:46.779

Dr. Sindhu Joseph: software providers, such as eMoney, then depending on the way each of these application providers works,

 

00:15:47.214 --> 00:15:52.869

Dr. Sindhu Joseph: for eMoney, we do need a specific API key for every single client,

 

00:15:52.870 --> 00:16:17.689

Dr. Sindhu Joseph: which means that once we get that API key from that client, that's when, like, you know, we can actually make that specific integration for that client. So there is a little bit of that last-mile configuration that we need to do, but beyond that, it is meant to be a plug-and-play

 

00:16:18.001 --> 00:16:30.458

Dr. Sindhu Joseph: experience for most part of the solution. There is also another place where, you know, there is a lot of work that needs to be done, which is, we have a set of workflows configured in the platform, such as

 

00:16:31.200 --> 00:16:56.869

Dr. Sindhu Joseph: the onboarding workflows, or the account opening workflows, or the money movement, or the account maintenance, like beneficiary change, address change, those kinds of workflows. So everything is packaged into the platform. But today the state of the industry is that, with every RIA, you would think that, you know, an onboarding process should be done in maybe 5 different ways,

 

00:16:56.870 --> 00:17:14.839

Dr. Sindhu Joseph: but not more than that. But every single RIA does that in, you know, if you have 20,000 RIAs, 20,000 ways. So there is a little bit of standardization that we ultimately want to bring to the industry, like, you know, onboarding should be done in 10 different ways and not more than that.

 

00:17:15.200 --> 00:17:39.350

Dr. Sindhu Joseph: We are on that journey. So we do customize to client preferences a bit, and that takes time to customize the workflows. But our vision is, maybe 2 years down the line, you know, we can actually say that this is the standard way everybody accepts onboarding to be done in the industry. And this

 

00:17:39.350 --> 00:18:01.830

Dr. Sindhu Joseph: accommodates, like, if you want to upload an account statement, you know, it allows you to do that. If you want to fill out a form, it allows you to do that. If the advisor wants to do it, you know, it allows you to do that. So we kind of configure it, you know, incorporating 5 or 10 ways, but that's pretty much it. That's how the industry should work, so that consolidation is yet to take place.
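
As a rough illustration of the "pre-built with last-mile configuration" idea above, a deployment might be described by a small descriptor pairing per-client credentials (such as the per-client eMoney API key she mentions) with the packaged workflows a firm enables. Every name, key, and variant below is a placeholder, not a real CogniCor configuration:

```python
# Illustrative deployment descriptor: pre-built integrations plus the
# last-mile, per-client settings discussed above. All names, keys, and
# workflow variants are placeholders.
DEPLOYMENT_CONFIG = {
    "client": "example-ria",
    "integrations": {
        "crm": {"type": "salesforce", "auth": "ENV:SF_CONNECTED_APP"},
        "planning": {"type": "emoney", "api_key": "ENV:EMONEY_API_KEY"},  # issued per client
        "portfolio": {"type": "orion", "api_key": "ENV:ORION_API_KEY"},
    },
    # Packaged workflows, toggled and lightly customized per firm rather
    # than rebuilt from scratch for each RIA.
    "workflows": {
        "onboarding": {"enabled": True, "variants": ["upload_statement", "fill_form", "advisor_entered"]},
        "account_opening": {"enabled": True},
        "money_movement": {"enabled": True},
        "account_maintenance": {"enabled": True, "types": ["beneficiary_change", "address_change"]},
    },
}
```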

 

70

00:18:02.770 --> 00:18:04.680

Mark Wickersham: So yeah, the

 

00:18:05.030 --> 00:18:17.339

Mark Wickersham: prepackaged best practices, like, on how to do certain tasks, right? And there's always advisors like, well, this is how we've always done it. That doesn't necessarily mean that that's the best way to do it, right? That's just.

 

00:18:17.340 --> 00:18:17.930

Dr. Sindhu Joseph: Yeah.

 

00:18:17.930 --> 00:18:24.070

Mark Wickersham: That historical precedent maybe had a reason in the beginning, but that reason may no longer exist.

 

00:18:24.340 --> 00:18:29.739

Mark Wickersham: That's where you get into change management, right? Change can be hard for these firms to be able to

 

00:18:30.010 --> 00:18:32.590

Mark Wickersham: overcome. How do you?

 

00:18:33.120 --> 00:18:51.459

Mark Wickersham: How does an advisor... is there a walk, run, sprint type of methodology that you have in terms of adoption? Is it, once everything's pre-configured, that it's a big bang? Or how does the adoption process work?

 

00:18:51.760 --> 00:18:56.888

Dr. Sindhu Joseph: Yeah. So we, since we have, like, you know, a number of things that

 

78

00:18:57.270 --> 00:19:21.500

Dr. Sindhu Joseph: sometimes advisors don't even expect. So they don't even realize that. Most advisors and RIAs come to us for, you know, still, note taking. So we then tell them that, you know, note taking is not their only problem. You know, we redefine the problem statement for them. So that being said, like, you know, the way we roll out the solution is, we don't

 

00:19:21.500 --> 00:19:32.689

Dr. Sindhu Joseph: bombard them with every single thing like you know we do a phased rollout, where, like, you know, we provide the essentials and then build up from there. So in that

 

00:19:32.690 --> 00:19:40.000

Dr. Sindhu Joseph: process, the advisors? You know. Sorry. Can you repeat that question? What?

 

00:19:40.910 --> 00:19:41.580

Dr. Sindhu Joseph: Okay?

 

00:19:42.030 --> 00:19:43.440

Mark Wickersham: The adoption process.

 

00:19:43.440 --> 00:19:44.759

Dr. Sindhu Joseph: Yeah, so, so, yeah.

 

00:19:44.760 --> 00:19:47.889

Mark Wickersham: How do, how do advisors adopt your platform? Is it

 

00:19:48.050 --> 00:19:51.160

Mark Wickersham: Walk, walk, run, sprint, or is it.

 

00:19:51.160 --> 00:20:11.489

Dr. Sindhu Joseph: Yeah, yeah, okay. So our onboarding process is, like, you know, again, this is one of the key differentiators that CogniCor has. We want to make sure that, you know, philosophically, we don't want to provide yet another platform to the advisors.

 

00:20:11.490 --> 00:20:22.070

Dr. Sindhu Joseph: We need to go to where the advisors work today, and that was a very strict requirement that we imposed on ourselves, which means that

 

00:20:22.070 --> 00:20:35.399

Dr. Sindhu Joseph: we should roll out this product in the most used platform for the advisory world. That is the CRM. So it would be embedded into their CRM. So we usually start with the sandbox

 

00:20:35.754 --> 00:20:58.790

Dr. Sindhu Joseph: experience. So we deploy everything into their sandbox, and then there are 1 or 2 advisors testing it to make sure, you know, everything is working properly end to end. You know, their Outlook is connected, their calendar is connected, their MoneyGuide is connected, their, like, you know, their Orion portfolio system is connected, everything is flowing smoothly.

 

00:20:58.790 --> 00:21:25.400

Dr. Sindhu Joseph: Once that happens, we push that into production for all of the advisors. It depends on the size of the organization. If it's a 20,000-person organization, we won't push that in one go; like, you know, that is a phased rollout. So that's the adoption process. And, you know, it is rolled out in multiple steps, but there is always a staging environment and a production environment.

 

00:21:26.430 --> 00:21:42.319

Mark Wickersham: Talk to me about the relationship between AI and data. Obviously, you know, having good data, quality data, is key to being able to get the results that you want from AI. But talk to me a little bit about that relationship.

 

00:21:42.916 --> 00:22:06.070

Dr. Sindhu Joseph: So I would, you know, start by highlighting that, you know, AI and data, ultimately, you know, there is no separation of these 2 things. Like, you know, one directly influences the other, and also the reverse. There was much more codependency, or, like, you know, the lack of data

 

00:22:06.070 --> 00:22:24.440

Dr. Sindhu Joseph: was a huge impediment in terms of deploying AI solutions in the past. There is a changed relationship between AI and data today. And that's because AI has progressed a lot in the past few years. So 1st of all.

 

00:22:24.440 --> 00:22:36.880

Dr. Sindhu Joseph: previously, like, we needed to have good, clean data to be able to run AI algorithms. Today, the available range

 

00:22:36.880 --> 00:23:03.939

Dr. Sindhu Joseph: of where AI can look has hugely expanded. So, for example, let's say, you know, you had a client conversation where you talked about different things, your dreams, your goals in the next 10 years. You also talked about your family members, your pet, your fears, everything in that 1-2 hour conversation with the financial advisor.

 

00:23:04.110 --> 00:23:29.029

Dr. Sindhu Joseph: Financial advisors, as diligent as they are, they can also be careless. They probably didn't put the notes in anywhere, so that conversation was lost, except for the, you know, transcript or recording that was there. So if that is the case, AI can actually go and look into those transcripts, look into those recordings, and bring back all of the information that was discussed in

 

00:23:29.030 --> 00:23:36.970

Dr. Sindhu Joseph: that meeting. So accurate data is important, but it doesn't need to be stored

 

00:23:37.350 --> 00:24:04.749

Dr. Sindhu Joseph: in places that we expected previously. So it doesn't need to be, like, you know, the family member information need not be added to a CRM. It could be, you know, stored somewhere as a recording, or as a transcript, or as a document, anywhere. So that's one relationship change because of the AI advancement that has happened. The second is

 

00:24:05.760 --> 00:24:25.420

Dr. Sindhu Joseph: the ability for AI to generate new data. And the prime example is note taking. So previously, you know, one of the main reasons AI was not adopted for recommendations, for insights, was, like, you know, my financial advisors don't take good notes, and I don't have good data. Now AI is going...

 

00:24:25.420 --> 00:24:28.190

Mark Wickersham: They don't take good notes, and they don't put it in the system for sure.

 

101

00:24:28.190 --> 00:24:52.139

Dr. Sindhu Joseph: Yeah. Yeah. So AI is going to do that for them and capture all of these notes. And it will, you know, summarize it properly, it will put it there. So the ability for AI to generate that data, capture those interaction moments, and, you know, be able to do that, and likewise in every other step. Like, you know, if it's a plan, like, you know, you can actually

 

102

00:24:52.210 --> 00:24:59.069

Dr. Sindhu Joseph: go into that plan, read that plan, you know, understand the probability of success when it has changed,

 

103

00:24:59.070 --> 00:25:24.050

Dr. Sindhu Joseph: or, like, you know, let's say the household is spending a lot more in the past few months, so immediately we can go into portfolio systems and understand this new pattern and then alert that you know you are spending more. You're not contributing enough to the plan that we have put together. Now, the probability of success suddenly changes

 

104

00:25:24.050 --> 00:25:48.440

Dr. Sindhu Joseph: from 95 to 83. So that means, like, you know, there is an action that needs to be taken based on that. So those kinds of things AI can do proactively and update the plans. So I think there is a lot more interconnectivity there that allows this.

 

105

00:25:48.840 --> 00:26:16.790

Dr. Sindhu Joseph: While data, good data, accurate data, is still the crux of every insight and every recommendation, every action that needs to be taken, AI has a much more pronounced role in creating this data and taking advantage of data that is sitting in places that were previously unknown or not accessible to the AI platforms.
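
A toy version of the proactive check she describes (spending drifts up, the plan's probability of success slides from the mid-90s, the advisor is alerted) might look like the sketch below. The data shapes and the crude probability adjustment are stand-ins for a real planning engine, shown only to illustrate the shape of the workflow:

```python
from typing import Optional

def monthly_spend(transactions: list) -> float:
    """Average monthly outflow over the supplied transaction history."""
    outflows = [t["amount"] for t in transactions if t["amount"] < 0]
    months = max(1, len({t["date"][:7] for t in transactions}))  # distinct YYYY-MM buckets
    return -sum(outflows) / months

def check_plan_drift(plan: dict, transactions: list, alert_threshold: float = 5.0) -> Optional[str]:
    """Return an advisor alert if higher spending meaningfully lowers plan success."""
    observed = monthly_spend(transactions)
    assumed = plan["assumed_monthly_spend"]
    if observed <= assumed:
        return None
    # Crude stand-in: every 10% of overspend costs ~2 points of success probability.
    overspend_pct = (observed - assumed) / assumed * 100
    new_probability = plan["probability_of_success"] - 0.2 * overspend_pct
    if plan["probability_of_success"] - new_probability >= alert_threshold:
        return (f"Household spending up {overspend_pct:.0f}% vs. plan; success probability "
                f"~{plan['probability_of_success']:.0f} -> {new_probability:.0f}. Suggest reviewing contributions.")
    return None

# Example: a plan assuming $8,000/month while recent months average $13,000.
plan = {"assumed_monthly_spend": 8000, "probability_of_success": 95}
txns = [{"date": "2024-05-03", "amount": -13000}, {"date": "2024-06-03", "amount": -13000}]
print(check_plan_drift(plan, txns))
```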

 

106

00:26:18.000 --> 00:26:28.899

Mark Wickersham: How do you guard against AI creating, you know, hallucinated data, making stuff up or crossing references or whatnot?

 

107

00:26:29.430 --> 00:26:56.420

Dr. Sindhu Joseph: Yeah, I think there is a risk there, especially in the broader domain, like, you know, when you open up your AI platform to, like, a ChatGPT kind of environment. Like, you know, it's an open domain, you can pretty much ask any questions, you are not giving a, you know, reference point where you need to have the information from.

 

108

00:26:56.500 --> 00:27:04.330

Dr. Sindhu Joseph: AI can absolutely hallucinate. But we are talking about a wealth industry where, if

 

109

00:27:04.330 --> 00:27:29.130

Dr. Sindhu Joseph: I'm not so much talking about training LLM models, because I think, you know, LLM models have their own cycles of training, releasing new models, and that is taken up by a few of the, you know, large tech firms in the industry. They are doing their job in terms of doing that. Our job mostly is, instead of, you know, looking

 

110

00:27:29.130 --> 00:27:53.320

Dr. Sindhu Joseph: to build an LLM model that is, you know, customized for the wealth industry, we do take the generic LLMs but then constrain them to only reason on the data that we provide. So absolutely, like, you know, for any platform, including CogniCor's, we make it a priority for our platform

 

111

00:27:53.320 --> 00:28:02.249

Dr. Sindhu Joseph: that, you know, when you give a recommendation, when you take a note and, you know, generate action items from there,

 

112

00:28:02.250 --> 00:28:24.779

Dr. Sindhu Joseph: everything is traced back to the source. You can absolutely trace it back to, you know, a document information that is discussed in a meeting or information that is available in a plan document and things like that. So traceability gives you a lot more confidence and trust in what are the insights that are

 

113

00:28:24.780 --> 00:28:48.490

Dr. Sindhu Joseph: provided by AI. I think that is an absolute characteristic that everybody should look into before they hire any AI solution, and I'm sure Acc will come out with those guidelines in terms of traceability. So with that, I think we can eliminate a lot of hallucination from AI, because you are saying that

 

114

00:28:48.490 --> 00:28:51.019

Dr. Sindhu Joseph: this is your data domain, like, you know, this is the only data you

 

115

00:28:51.510 --> 00:29:03.590

Dr. Sindhu Joseph: can use to derive any insights or any action that you know you are proposing, instead of relying on your own like, you know your own open intelligence.
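
A minimal sketch of the grounding-and-traceability pattern she describes: retrieve passages from the firm's own documents, constrain the model to answer only from them, and keep the source identifiers alongside the answer for review. The in-memory document list, retriever, and llm_complete stub are placeholders for whatever vector index and model endpoint a firm actually uses; this is not CogniCor's implementation.

```python
# Sketch of grounding plus traceability: retrieve passages from the firm's own
# documents, ask the model to answer ONLY from them, and keep the source IDs
# alongside the answer for review.

DOCUMENTS = [
    {"source_id": "meeting-2024-03-12", "text": "Client mentioned a goal of retiring at 60."},
    {"source_id": "plan-v7", "text": "Current plan assumes retirement at 65 with 92% success."},
]

def retrieve(question: str, k: int = 2) -> list:
    # Stand-in retriever: a real one would query a vector index of meeting
    # transcripts, plan documents, CRM notes, etc.
    return DOCUMENTS[:k]

def llm_complete(prompt: str) -> str:
    # Stand-in for a chat-completion call to a general-purpose LLM.
    return "Retirement age differs between the meeting [meeting-2024-03-12] and the plan [plan-v7]."

def grounded_answer(question: str) -> dict:
    passages = retrieve(question)
    context = "\n\n".join(f"[{p['source_id']}] {p['text']}" for p in passages)
    prompt = (
        "Answer using ONLY the sources below and cite the [source_id] for each claim. "
        "If the sources do not contain the answer, say so.\n\n"
        f"Sources:\n{context}\n\nQuestion: {question}"
    )
    return {"answer": llm_complete(prompt), "sources": [p["source_id"] for p in passages]}

print(grounded_answer("When does the client want to retire?"))
```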

 

116

00:29:04.650 --> 00:29:23.530

Mark Wickersham: With family offices and advisors that are looking to adopt AI, let's just talk about in general, what should they be thinking about in terms of creating an AI adoption policy? How can they create certain guardrails? What path would you recommend for these firms to be able to,

 

117

00:29:23.530 --> 00:29:37.120

Mark Wickersham: you know, properly adopt AI and be able to experiment with AI without kind of putting their clients' data in, you know, revealing personal private information to a broader public LLM?

 

118

00:29:37.650 --> 00:30:02.210

Dr. Sindhu Joseph: Yeah. So 2 things. One is, like, you know, they need to absolutely understand, like, you know, what are the user experiences, what are the advisor experiences that they want to provide to their advisors, to their users. So I think everybody has to start from there instead of starting with, oh, I need to adopt an AI solution. Like, you know, AI is just a tool,

 

119

00:30:02.210 --> 00:30:26.529

Dr. Sindhu Joseph: and I shouldn't be thinking from a tool standpoint. I should be thinking, like, you know, this is the experience that I want to deliver to my advisors, to my clients, and what will enable me to do that. So having that clarity will guide all of your decisions very, very clearly. So I think that's a starting point for every CIO, CTO,

 

120

00:30:26.530 --> 00:30:35.380

Dr. Sindhu Joseph: or like, you know, CEO, trying to roll out an AI solution for the wealth advisors. Second, is like, you know.

 

121

00:30:35.380 --> 00:30:40.810

Dr. Sindhu Joseph: today, these tools are very, very

 

122

00:30:41.370 --> 00:31:04.309

Dr. Sindhu Joseph: available in the market, like, you know, in the sense that ChatGPT is accessible to pretty much everyone, especially the free version. So it's very tempting for clients as well as for advisors to put their client's data in to, you know, get a document prepared, for example, to be sent to the client,

 

123

00:31:04.310 --> 00:31:28.509

Dr. Sindhu Joseph: or, you know, what recommendations should I give to the, you know, client in this scenario? So, you know, copy all of the client data, put it into ChatGPT, it will spin out a number of recommendations. So that is amazing. So these things are tempting, it's out there. But, you know, also be mindful that this data will go not only into the domain of ChatGPT, but also

 

124

00:31:28.690 --> 00:31:48.693

Dr. Sindhu Joseph: train ChatGPT's algorithms. So I think, you know, that's where we need to have that absolute guardrail policy in terms of making sure that they are not using these kinds of platforms. There are options there, like if you

 

125

00:31:49.560 --> 00:32:01.970

Dr. Sindhu Joseph: take a paid version of ChatGPT and make sure that, you know, in the configurations, you can specify that your data should not be used anywhere. I think those are

 

126

00:32:01.970 --> 00:32:19.579

Dr. Sindhu Joseph: better configurations that you can do. I'm not sure, like, you know, what happens behind the scenes; I haven't looked at the contractual agreements that ChatGPT has. But at least those minimums you are, you know, required to do, obliged to do, from a client perspective.

 

127

00:32:19.910 --> 00:32:44.819

Dr. Sindhu Joseph: But even beyond that, I think we need to look at where the client data resides, and, you know, if that is isolated from other client data. So that's the, you know, number one thing that we need to look at, how the data... you know, these are, you know, basic SaaS hygiene practices that, you know, every other industry adopts as well. The second, which is particular to AI, is

 

128

00:32:44.900 --> 00:33:11.089

Dr. Sindhu Joseph: how my data is being used to improve the performance of the AI. So I think, while everybody claims that, you know, they are fine-tuning and retraining the LLMs, there is no retraining happening in LLMs, because LLMs are huge models that are trained by these large tech platforms,

 

129

00:33:11.090 --> 00:33:27.790

Dr. Sindhu Joseph: and that is done by them. There is no training happening in the individual application layer. There is fine-tuning happening, or prompt adaptation that is happening. So you can put in a lot of

 

130

00:33:27.790 --> 00:33:52.749

Dr. Sindhu Joseph: client intelligence to make the product work for you in a different way. So that's where you need to make sure there are contractual agreements in place where, you know, it is only anonymized client data that would be used to train these algorithms. And you can also, you know, put in place different parameters, like, you know, I don't want my secret sauce to be

 

131

00:33:52.750 --> 00:34:20.259

Dr. Sindhu Joseph: trained into the AI, so you can separate those out in terms of getting that information into AI. But there are a lot of benefits the industry gets from AI understanding the nuances of customer behavior and things like that in a broader landscape, and I think clients should be open to doing that,

 

132

00:34:20.260 --> 00:34:29.999

Dr. Sindhu Joseph: of course, with anonymized data. But that will give us, as an industry, a leap forward in terms of AI, providing better intelligence and insights.
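
One simple form of the guardrail discussed here is scrubbing obvious identifiers before anything is sent to an external model. The sketch below is illustrative only; the patterns are far from a complete PII policy and would sit alongside the contractual and configuration controls she mentions:

```python
import re

# Illustrative pre-send guardrail: redact obvious identifiers before any client
# text leaves the firm's environment. A real policy would go much further
# (named-entity redaction, allow-lists, contractual no-training clauses, etc.).
REDACTIONS = [
    (re.compile(r"\b\d{3}-\d{2}-\d{4}\b"), "[SSN]"),                      # US SSN-shaped strings
    (re.compile(r"\b\d{8,12}\b"), "[ACCOUNT_NUMBER]"),                    # bare account numbers
    (re.compile(r"[A-Za-z0-9._%+-]+@[A-Za-z0-9.-]+\.[A-Za-z]{2,}"), "[EMAIL]"),
]

def redact(text: str) -> str:
    for pattern, replacement in REDACTIONS:
        text = pattern.sub(replacement, text)
    return text

print(redact("SSN 123-45-6789, account 0012345678, reach me at jane@example.com"))
# -> SSN [SSN], account [ACCOUNT_NUMBER], reach me at [EMAIL]
```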

 

133

00:34:31.630 --> 00:34:33.609

Mark Wickersham: We're obviously.

 

134

00:34:33.989 --> 00:34:56.059

Mark Wickersham: you're talking generative AI. But now it's moved, and the latest phase is agentic AI. An example would be not only taking the notes but then creating action items and workflows off of that. Talk to me a little bit about agentic AI, and how that's embedded into your platform, and the power that has for advisors.

 

135

00:34:57.062 --> 00:35:04.460

Dr. Sindhu Joseph: So the funny story is, I did my PhD more than 10 years back in multi-agent systems,

 

136

00:35:05.217 --> 00:35:10.073

Dr. Sindhu Joseph: Where we had multiple agents working together to

 

137

00:35:10.820 --> 00:35:13.686

Dr. Sindhu Joseph: to take, you know, to

 

138

00:35:14.280 --> 00:35:38.760

Dr. Sindhu Joseph: obey certain norms that are imposed by that agent society and take different actions. So it is a concept that has been in the academic space for a very, very long time, and because of my background in that space our core system is built on a multi agent framework before the agentic approach came

 

139

00:35:38.760 --> 00:36:02.399

Dr. Sindhu Joseph: into the mainstream. So I am a big believer in that. I think the best way of organizing an artificial intelligence system is using this agentic approach, because it mimics how humans work: it delegates tasks to separate agents who are specialized in doing certain tasks

 

140

00:36:02.805 --> 00:36:27.520

Dr. Sindhu Joseph: or doing certain functions, and then, you know, they coordinate together. So I think it's a great framework to be used. So, for example, CogniCor's platform is organized with multiple agents that work together, such as, you have a planning agent, you have a portfolio agent, you have a custodial agent, you have an

 

141

00:36:27.520 --> 00:36:47.130

Dr. Sindhu Joseph: agent that reads unstructured data. So all of these things come together to provide insights and intelligence and take actions. So the great thing about agents is you only need to provide a schema for them to operate. Like, you know, you

 

142

00:36:47.130 --> 00:36:59.590

Dr. Sindhu Joseph: can read an external system by looking at, okay, this is the schema of that particular system. So for all of the information that

 

143

00:37:00.046 --> 00:37:13.199

Dr. Sindhu Joseph: I need, I can actually rely on this particular system, because the schema says that, you know, that information is available there. So that schema-based approach is, like, you know, very, very easy for agents to work with

 

144

00:37:13.200 --> 00:37:32.939

Dr. Sindhu Joseph: in terms of, like, you know, deciding where do I go and look for information, which agent would I call in terms of executing certain actions. Like, if I have a workflow agent, I can actually know that this agent publishes that, you know, my capabilities are these. That's nothing but the schema. So, using this,

 

145

00:37:32.940 --> 00:37:34.779

Dr. Sindhu Joseph: I can actually go and

 

146

00:37:34.780 --> 00:37:57.210

Dr. Sindhu Joseph: call this agent when a workflow needs to be executed, which could be an onboarding agent, which could be an account opening agent, or things like that. So everyone says, hey, you know, this is what I can do. And then the orchestrator agent looks at the problem at hand, creates a thought process of how to solve this problem, and then realizes,

 

147

00:37:57.210 --> 00:38:19.805

Dr. Sindhu Joseph: yeah, I need to make use of the onboarding agent to get that onboarding work done, and delegates the work to that particular agent. So it is beautifully done. I think there are still nuances in, you know, getting the system to scale and all that. But we are very much

 

148

00:38:20.450 --> 00:38:29.829

Dr. Sindhu Joseph: in the right direction in terms of, you know, getting there and making sure that these kinds of systems perform at the scale that we want.
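
A toy illustration of the capability-schema delegation she describes: each agent publishes what it can do, and an orchestrator matches the task at hand to a specialist. The agent names and capabilities are hypothetical, and a production system would add planning, memory, and error handling on top:

```python
from dataclasses import dataclass
from typing import Callable

@dataclass
class Agent:
    name: str
    capabilities: list            # the "schema" the agent publishes
    handler: Callable[[dict], str]  # what the agent actually does

class Orchestrator:
    def __init__(self, agents: list):
        self.agents = agents

    def delegate(self, task: dict) -> str:
        needed = task["capability"]
        for agent in self.agents:
            if needed in agent.capabilities:
                return agent.handler(task)  # hand the work to the specialist
        raise LookupError(f"No agent advertises capability: {needed}")

# Hypothetical specialists mirroring the examples in the conversation.
planning_agent = Agent("planning", ["read_plan", "update_plan"],
                       lambda t: f"Plan reviewed for {t['household']}")
onboarding_agent = Agent("onboarding", ["open_account", "onboard_household"],
                         lambda t: f"Onboarding started for {t['household']}")

orchestrator = Orchestrator([planning_agent, onboarding_agent])
print(orchestrator.delegate({"capability": "onboard_household", "household": "Smith"}))
```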

 

149

00:38:30.880 --> 00:38:50.108

Mark Wickersham: I mean, the beauty of these agents is that they don't forget, right? You're not gonna have a meeting and then forget to do one out of the 10 tasks; those tasks are gonna be either completed or you're gonna be notified of some sort of issue with completing them. So I think that's the beauty of it.

 

150

00:38:51.620 --> 00:38:57.679

Mark Wickersham: In the Wickersham household, I could use an agent to unload the dishwasher in the morning. I haven't found, yeah

 

151

00:38:59.610 --> 00:39:01.069

Mark Wickersham: nails every morning, but.

 

152

00:39:01.070 --> 00:39:30.469

Dr. Sindhu Joseph: Unless they hallucinate, I think. You know, recently I had a funny experience with ChatGPT. I was trying to get a video done, and I asked ChatGPT to create a script for me. It created a beautiful script. And then I said, can you also, you know, make this video? And ChatGPT said, absolutely, I can make this video for you, give me all the artifacts. And I uploaded the artifacts, and then

 

153

00:39:30.787 --> 00:39:36.190

Dr. Sindhu Joseph: then it said, you know it would take 2 to 5 business days, and I'll come back

 

154

00:39:36.230 --> 00:40:04.110

Dr. Sindhu Joseph: after that. So we negotiated a bit in terms of the timelines, and it kept me going. Like, you know, for me it was a fun experiment to kind of understand what it would do, but then it had no capability to build that video from scratch. But it, you know, kept giving me timelines and deadlines and, you know, progress that it is completing and all that. So unless the agent hallucinates, you know, everything.

 

155

00:40:04.650 --> 00:40:11.827

Mark Wickersham: I feel like ChatGPT is losing it more and more these days. I try to get it to create scripts and graphics and

 

156

00:40:12.210 --> 00:40:31.529

Mark Wickersham: try to be specific in my prompts, and sometimes I have great results. And then other times, it's just not what I asked for, or it's mixed up. Like, I asked it for 10 podcast guests that I should have on the show, and it literally made up 3 of them, like those people didn't exist, right? So

 

157

00:40:31.580 --> 00:40:58.459

Mark Wickersham: which brings me to the next thing. What is the next frontier? And let's also talk about decision making. It seems like, obviously, a human needs to be in the loop, like decision making is not there yet. Where do you see the next evolution of AI? What's on the horizon that you've been working with, that you see is out there, that maybe we don't know about?

 

158

00:40:59.070 --> 00:41:07.915

Dr. Sindhu Joseph: Yeah, I think, you know, decision making, to a certain extent, it is there. But, you know,

 

159

00:41:09.320 --> 00:41:23.860

Dr. Sindhu Joseph: it is understanding nuances of the physical space and, you know, taking the right decision in the social construct. For example, can we replace,

 

160

00:41:23.950 --> 00:41:43.110

Dr. Sindhu Joseph: you know, a state or federal judge today with an AI? You can absolutely do that. But do you want to do that? No, for the foreseeable future you don't want to, for 2 reasons. One is you want somebody to be responsible for those decisions, to, you know, to have that

 

161

00:41:43.548 --> 00:41:59.850

Dr. Sindhu Joseph: empathy playing into the space, and, you know, to have the social construct being taken into account and things like that. So if you want to have an unemotional, like, you know, purely data-based decision,

 

162

00:41:59.850 --> 00:42:00.390

Mark Wickersham: Thanks, too.

 

163

00:42:00.390 --> 00:42:30.219

Dr. Sindhu Joseph: you can still, you know, kind of rely on an AI agent. So I always believe that, you know, humans are... I'll give you another example. Like, you know, when I get emails or documents from my colleagues or others which are obviously ChatGPT, you tend to kind of gloss over it. You tend to ignore it, like you don't read it.

 

164

00:42:30.220 --> 00:42:30.780

Mark Wickersham: Yeah.

 

165

00:42:30.780 --> 00:42:38.480

Dr. Sindhu Joseph: Whereas, like, you know, if you write with your own like like.

 

166

00:42:38.480 --> 00:43:01.896

Dr. Sindhu Joseph: and imperfections, it is a beauty to read, and you feel connected to that person. So I think one of the great characteristics of humans is our ability to err. So we have that capability of imperfection, and I believe,

 

167

00:43:02.360 --> 00:43:20.600

Dr. Sindhu Joseph: instead of a flaw, it is an asset for humans. And so, if we can combine that asset with the excellence of AI, I think there is a collaboration that can last. So in terms of decision making, I think

 

168

00:43:20.600 --> 00:43:47.259

Dr. Sindhu Joseph: this is one of the human-in-the-loop approaches. Especially for our industries, where regulation is one of the key facets that we need to look at, it's always a human-in-the-loop approach that makes sense, also because of the human nuances that I described. I think the client interactions, where the ability for us to empathize, the ability for us to

 

169

00:43:47.970 --> 00:44:12.649

Dr. Sindhu Joseph: provide that advanced communication, I think all of that is very critical to sustain that relationship. So humans are great at some of these things, and we need to, you know, elevate their greatness from that standpoint. That said, in the AI space, I think the agentic space, agentic

 

170

00:44:13.110 --> 00:44:38.049

Dr. Sindhu Joseph: capability, is what is going to kind of drive a lot more excitement in the future, which has not just the capability to extract and generate intelligence, but to do actions. We have, you know, kind of dipped our toes into the action space, but I think there is a huge opportunity there in the future,

 

171

00:44:38.050 --> 00:44:49.290

Dr. Sindhu Joseph: where actions can be done by agents. It has the last mile problem where, like, you know, you need to have connectivity with the different systems. You need to kind of

 

172

00:44:49.690 --> 00:44:58.499

Dr. Sindhu Joseph: start to have a common interface for similar systems, and so on, and the physical you know, capability. When you are

 

173

00:44:58.500 --> 00:45:22.610

Dr. Sindhu Joseph: ordering a pizza, like, you know, the pizza has to be delivered to the person's house at a certain time. You can involve a human in the loop to get this pizza delivered, or you can have a drone get the pizza delivered, completely managed by an AI system. So, like, you know, you have that future vision of

 

174

00:45:22.610 --> 00:45:44.443

Dr. Sindhu Joseph: that state that you can envision where AI is running a lot more of the day-to-day, where the human's role is, you know, defined, but, you know, it is limited as well. So that's where I see AI going. And there is a partnership that

 

175

00:45:45.180 --> 00:46:03.530

Dr. Sindhu Joseph: I'm not sure... it is not yet in the limelight, but I would like to see in the limelight, between AI and blockchain, where you can create that trust, bringing AI execution and the trust of blockchain together. You know, it will

 

176

00:46:03.530 --> 00:46:20.929

Dr. Sindhu Joseph: create amazing experiences that you can trust, because AI brings an inherent mistrust, blockchain brings an inherent trust. And this partnership, I think, if we can really create it, I think that's an amazing experience.

 

177

00:46:22.660 --> 00:46:42.640

Mark Wickersham: What about AI's impact on society? And then there's obviously the negative effects of AI; certainly cybersecurity is one of them. But where do you see the potential for AI to go off the rails?

 

178

00:46:44.002 --> 00:46:58.100

Dr. Sindhu Joseph: Where? You know, the question could be rephrased as where I don't see the potential. I think, you know, as with any technology that humans create, there is a bigger potential for it to create

 

179

00:46:58.100 --> 00:47:25.679

Dr. Sindhu Joseph: malware and create disruption, create negative use cases. Then, you know, it is just our human will and the regulations that will, you know, help us stay the course. But, you know, both humans as well as AIs inherently can deviate to those paths. With that said, like, you know, what I want to kind of highlight here is,

 

180

00:47:25.680 --> 00:47:51.600

Dr. Sindhu Joseph: one of the key things that, you know, we will see in the next few decades is, while it will create a lot of, you know, jobs, and there will be a flurry of activity wherein these AI systems have to be deployed in a lot of the industries that we work in today, then, like, you know, I also see a possibility of a massive

 

181

00:47:51.680 --> 00:48:07.920

Dr. Sindhu Joseph: job reduction because of AI. You know, it would take over some of what we humans have been doing. I kind of philosophize a bit around that, in the sense of,

 

182

00:48:08.950 --> 00:48:33.319

Dr. Sindhu Joseph: when we became, like, you know, we were hunter-gatherers in the beginning, and then, you know, the industrial revolution happened, the internet revolution happened, all that. We became a cog in the wheel. Like, you know, we became workers who, you know, just wake up in the morning and, you know, do a repetitive task till the end of the day, and then, you know, go back and sleep.

 

183

00:48:33.320 --> 00:48:50.209

Dr. Sindhu Joseph: I don't think humans are meant to do that. You know, we exist in this world with a much more holistic personality beyond the work association that we give today. I think, as a society, we need to rethink in terms of

 

184

00:48:50.230 --> 00:49:03.130

Dr. Sindhu Joseph: what is humans' role, like, you know, how can we live in a society that is redefined by AI, and what is the best way. Like, you know, I think it's a huge liberation for us from that industrial revolution,

 

185

00:49:03.130 --> 00:49:23.530

Dr. Sindhu Joseph: to eliminate, to remove us from that cog-in-the-wheel approach. And now, you know, we are free to define what we want to do as a human being. How do we want to define our identity? Do we want to distance a bit from our, like, you know, work identity? You know, when I introduce myself, am I

 

186

00:49:23.530 --> 00:49:33.689

Dr. Sindhu Joseph: primarily saying that, you know, I have a PhD in artificial intelligence and am CEO and founder of CogniCor, or, like, you know, do I have a broader definition of myself that

 

187

00:49:33.690 --> 00:49:45.520

Dr. Sindhu Joseph: includes much more? You know, different aspects that I can, you know be proud of, and I can exercise more often. So I think AI gives

 

188

00:49:45.520 --> 00:50:09.169

Dr. Sindhu Joseph: us, the broader society, that space to rethink, and governments, I think, to rethink what our role is and how we can make the best use of this AI revolution. I'm not sure if we are doing enough of that. It is a great opportunity, and, you know, I think we should take it up.

 

189

00:50:10.330 --> 00:50:13.120

Mark Wickersham: I think it's a good point. I think you look at the you know

 

190

00:50:13.180 --> 00:50:35.430

Mark Wickersham: how many, how much of the population was tied to agriculture. In the beginning, you know, the Us. Is still one of the leading agricultural output nations in the world, but very few people work in agriculture. Manufacturing is somewhat the same way with robotics. Now you're starting to see, you know, back office reconciliation. Some of these jobs, that

 

191

00:50:35.430 --> 00:50:52.150

Mark Wickersham: accounting which have been really have been hard to staff and hard to to retain talent. There, that does that get those jobs get eliminated, or at least you need fewer of them. And those people that are are doing it, or using AI to be able to do those things.

 

192

00:50:52.180 --> 00:50:56.680

Mark Wickersham: I do get worried somewhat at a lack of

 

193

00:50:56.920 --> 00:51:06.839

Mark Wickersham: regulation. I think if you looked at what happened with social media, I think social media is as much of a negative force in the world as it is a positive force in the world. And there, there's been a real lack of

 

194

00:51:07.070 --> 00:51:20.269

Mark Wickersham: of regulation around there, and and you can see, especially in the Us. As a tendency that do the same with with AI. So I you know I do have some concerns with that, but you can see the potential to unlock

 

195

00:51:20.688 --> 00:51:32.030

Mark Wickersham: you know what is the nature of work, you know, and and how does that affect me as a as a human being? And there is historical precedence that kind of look back on on that as well.

 

196

00:51:32.580 --> 00:51:33.050

Dr. Sindhu Joseph: Yeah, I do

 

197

00:51:33.900 --> 00:51:59.030

Dr. Sindhu Joseph: think it is not just regulation that we should look at. It is also, like, you know, as technologists, we get excited about the next incremental progress. Like, you know, when social media came around, every incremental step was defined as progress. And we often tend to kind of look at

 

198

00:51:59.609 --> 00:52:21.539

Dr. Sindhu Joseph: the near-term incremental progress rather than looking at the longer term: where is all this going? So we often ignore the impact on, like, you know, people's lives because of this. And since Silicon Valley is driving all of this, you know, it is very focused on

 

199

00:52:21.660 --> 00:52:50.080

Dr. Sindhu Joseph: technological advancement, and everybody gets excited as technologists that we are making progress. And as long as we are making progress, we can make it faster and faster, and we get more and more excited about it. And that's the only aspect that we look at. But, you know, I think looking at the impact, not just through regulation, but, you know, people having a broader outlook of, you know, how can I

 

200

00:52:50.360 --> 00:53:05.579

Dr. Sindhu Joseph: design these systems to make sure they have a positive impact on society rather than, you know, being driven by profits. So creating more shareholders in the ecosystem is what I think is needed.

 

201

00:53:06.200 --> 00:53:21.849

Dr. Sindhu Joseph: Today, society is not a shareholder in any of the, you know, Googles or Facebooks of the world, and it needs to be a shareholder. And I think that will change the thought process a bit.

 

202

00:53:22.790 --> 00:53:27.015

Mark Wickersham: Yeah. And still, in a lot of ways, they bear the cost of it.

 

203

00:53:27.550 --> 00:53:35.859

Mark Wickersham: You're a self-declared AI junkie, using AI all day in your personal life. What are some of the ways in which you use

 

204

00:53:36.080 --> 00:53:42.210

Mark Wickersham: AI in your personal life that people might not be aware of, that could be useful for them?

 

205

00:53:43.350 --> 00:54:09.769

Dr. Sindhu Joseph: I am, like, you know, I would say, part AI junkie, part very, very human, in the sense that I try to distance myself from it. Like, again, my background is, I grew up in a hill station as remote as you can imagine, unpolluted by anything that

 

206

00:54:10.841 --> 00:54:16.979

Dr. Sindhu Joseph: you would see in a developed place. So I have

 

207

00:54:17.090 --> 00:54:25.576

Dr. Sindhu Joseph: a part of me that always calls me back to nature, and, you know, part of me is working with AI, so I am kind of

 

208

00:54:25.880 --> 00:54:47.980

Dr. Sindhu Joseph: at times conflicted, but most of the time, I think, you know, these two worlds live together for me, and I have a philosophy of, you know, bringing the two together. That's why, you know, I have taken the approach of AI liberating humans and, you know, bringing them closer to nature, more than, you know, any other technological advances that I'm excited about.

 

209

00:54:48.543 --> 00:54:51.360

Dr. Sindhu Joseph: So that's my posture always.

 

210

00:54:51.716 --> 00:54:57.783

Dr. Sindhu Joseph: So I do use AI a lot in my work. I think I become

 

211

00:54:58.140 --> 00:55:21.629

Dr. Sindhu Joseph: at least 50% more productive using AI. Like, for example, if I want to prepare for a podcast, or, you know, prepare for a webinar or a speech, or, like, you know, any of those things, AI is doing the groundwork for me. Now I'm only doing the last mile; everything else is done by AI.

 

212

00:55:22.026 --> 00:55:36.678

Dr. Sindhu Joseph: If I have to strategize, like, you know, if I have to close a sale, I give some inputs, and AI gives me a strategy in terms of how to do that. So

 

213

00:55:37.660 --> 00:55:42.878

Dr. Sindhu Joseph: I think in every aspect of my work life,

 

214

00:55:43.800 --> 00:56:00.969

Dr. Sindhu Joseph: except for the last mile, everything is done by AI. So that's how I survive. I think AI can really make that change, and I, you know, definitely use that time for being less busy.

 

215

00:56:01.864 --> 00:56:06.690

Mark Wickersham: When you use AI, do you say please and thank you?

 

216

00:56:07.115 --> 00:56:17.759

Dr. Sindhu Joseph: Yeah, I don't, because I don't see AI as a person. And I am very conscious of

 

217

00:56:17.950 --> 00:56:28.350

Dr. Sindhu Joseph: typing the next sentence into an LLM, because I know it is an environmental polluter. So.

 

218

00:56:28.350 --> 00:56:31.060

Mark Wickersham: Including it clouds up the prompt. Yeah.

 

219

00:56:31.060 --> 00:56:54.630

Dr. Sindhu Joseph: Exactly. So also, even though I have the urge to, you know, maybe out of curiosity, look up something, I kind of hold that urge, and I only use it when I absolutely have to, and I try to use it in an optimized way, so that I am reducing the number of,

 

220

00:56:54.962 --> 00:57:13.930

Dr. Sindhu Joseph: you know, queries or prompts that I'm sending out. This is just a personal discipline that I follow, simply because people don't realize, since it's just an interaction with a computer, what goes on behind it and the environmental impact it has in terms of executing these queries.

 

221

00:57:13.930 --> 00:57:14.460

Mark Wickersham: Right?

 

222

00:57:15.961 --> 00:57:34.839

Mark Wickersham: So I love to end these podcasts on a personal note. I'm just wondering: obviously, you've been able to live in a lot of different places across the globe, India, Spain, San Francisco. What do you love about each of those places, and what are some of the differences that people might not be aware of?

 

223

00:57:35.820 --> 00:57:37.506

Dr. Sindhu Joseph: Oh,

 

224

00:57:38.840 --> 00:58:05.508

Dr. Sindhu Joseph: we probably need another podcast to discuss this. But, you know, I think I can probably highlight one, you know, great aspect of each place. Maybe starting with India: India is known for a lot of things, or, like, you know, people also have certain

 

225

00:58:06.480 --> 00:58:24.090

Dr. Sindhu Joseph: feelings towards, you know, how Indians appear in front of the global population. But I think one of the lesser-known facts about India is, you know, we have a deep history and background of spiritual focus.

 

226

00:58:24.503 --> 00:58:50.570

Dr. Sindhu Joseph: And, you know, everything goes back to, you know, our different levels of happiness. That is very, very rooted in daily life, like, you know, in terms of how we can elevate our happiness by helping others, doing yoga and meditation, and things like that. So I hold that very, very dear to my heart.

 

227

00:58:50.942 --> 00:58:56.910

Dr. Sindhu Joseph: In the U.S., you can be caught up in the consumerist world for happiness.

 

228

00:58:56.910 --> 00:59:23.930

Dr. Sindhu Joseph: You know, we have the exact opposite philosophy in India. It is today influenced a lot by Western, you know, behavior, Western philosophy. But in India we grew up in an opposite world where, you know, the fewer things you possess, the happier you become. So there is an association with that, and I think

 

229

00:59:23.930 --> 00:59:44.579

Dr. Sindhu Joseph: there is an attempt from people of the new generation to bring back that philosophy. And I think India can lead the world down a different path. I'm very excited about that; I think more and more people can adopt it. So, you know, that is part of what I most cherish about India.

 

230

00:59:45.510 --> 00:59:47.140

Mark Wickersham: I think we probably need more of that.

 

231

00:59:48.010 --> 01:00:14.980

Dr. Sindhu Joseph: In Europe, it's already, you know, very romanticized, and, having lived in Barcelona, I can fully appreciate why. There is a huge appreciation for the human aspect of existence. It's not, you know, the work that defines us. I think, you know, they get it

 

232

01:00:14.980 --> 01:00:24.530

Dr. Sindhu Joseph: more than anybody else in the world that, you know, humans are more precious, and there is an appreciation for, you know,

 

233

01:00:24.960 --> 01:00:28.969

Dr. Sindhu Joseph: just living life

 

234

01:00:29.120 --> 01:00:47.404

Dr. Sindhu Joseph: just being there, like, you know, walking down Ramblas with no intentions, just walking. And, you know, standing in a queue for 20 minutes, they have no rush. So I think life happens in those small moments. And

 

235

01:00:47.830 --> 01:01:05.509

Dr. Sindhu Joseph: Europeans know how important that is, and I truly appreciated my time there. It is one of the things that I most miss, being in San Francisco. So I think

 

236

01:01:05.610 --> 01:01:18.610

Dr. Sindhu Joseph: they have figured out how to live, in certain ways. In the U.S., I appreciate a lot the work culture and, I think, the

 

237

01:01:19.750 --> 01:01:20.610

Dr. Sindhu Joseph: professional

 

238

01:01:21.620 --> 01:01:34.490

Dr. Sindhu Joseph: values that we hold are amazing. I have been here for seven-plus years now, and I'm still, like, you know, trying to

 

239

01:01:36.310 --> 01:01:51.850

Dr. Sindhu Joseph: plant myself in. I wouldn't say I have succeeded wholeheartedly, but I am somewhat, like, you know, trying to appreciate the values of freedom, and

 

240

01:01:52.090 --> 01:02:09.760

Dr. Sindhu Joseph: the opportunity that, you know, everybody can grow and become a valued part of society is very appealing. But I would say I'm still conflicted about everything that is happening currently.

 

241

01:02:11.390 --> 01:02:29.449

Mark Wickersham: Well, I mean, you're certainly right in the middle of it in San Francisco, in terms of, you know, Silicon Valley innovation. I do think you look at the U.S., and that particular corner of the U.S. especially, and there is a ton of risk-taking innovation coming out of there. It's certainly a hallmark

 

242

01:02:29.860 --> 01:02:38.419

Mark Wickersham: of the U.S., for a lot of different factors. And then Europe: let's talk about the food.

 

243

01:02:38.880 --> 01:02:44.499

Dr. Sindhu Joseph: Yeah, yeah. So don't even get me started on the food. Like, you know,

 

244

01:02:45.150 --> 01:02:54.449

Dr. Sindhu Joseph: you can eat a really tasty, healthy meal in Europe. And, you know, that is something that, again, like, you know, is

 

245

01:02:54.580 --> 01:03:05.639

Dr. Sindhu Joseph: very sorely missed here. And thanks for highlighting the innovation and risk-taking aptitude here. I think, you know, it's again kind of

 

246

01:03:05.790 --> 01:03:10.782

Dr. Sindhu Joseph: lost in all of the things that are happening here. But I think

 

247

01:03:12.140 --> 01:03:26.140

Dr. Sindhu Joseph: people should be like, you know, there is only one life, and you can be that risk-taker and live to the fullest. And I think U.S. people know how to do that.

 

248

01:03:26.940 --> 01:03:40.369

Mark Wickersham: Well, you're certainly a part of it, Sindhu, with being an entrepreneur and helping to make the wealth management industry better. I appreciate your time today. It's been a fascinating conversation, and thank you so much for sharing.

 

249

01:03:40.710 --> 01:03:45.319

Dr. Sindhu Joseph: Yeah, thank you so much for inviting me, and I appreciate the conversation.

 
