
Our Articles

AI Development
How to build a data culture in the AI-powered environment
April 23, 2025
11 min read

Dr. Bange is the founder and CEO of BARC and an expert market analyst for data analytics and AI. For over 25 years, he has focused on evaluating software vendors and technologies, helping organizations make informed decisions based on market trends, strengths, and weaknesses of various solutions.

How to build a data culture in the AI-powered environment

In today’s world, data is more than just a byproduct of business operations. It’s a strategic asset. Many organizations invest in data tools and technologies. Nevertheless, the real challenge lies in creating a culture where data is understood, trusted, and consistently used to drive decisions at every level. How do you do it? That’s one of the key questions that Max Golikov, the Innovantage podcast host and CBDO at Sigli, discussed with his guest Dr. Carsten Bange.

Dr. Bange is the founder and CEO of BARC and an expert market analyst for data analytics and AI. For over 25 years, he has focused on evaluating software vendors and technologies, helping organizations make informed decisions based on market trends and the strengths and weaknesses of various solutions. While BARC began as a technology advisory firm, its scope quickly expanded. Today, the company also supports clients with data and AI strategy, organizational development, and data governance. All of this is a key foundation for successful analytics initiatives.

A major area of Dr. Bange’s work is data culture, which is the human side of data-driven transformation. As he emphasized, even the most advanced technologies can fail without the right mindset, skills, and engagement from people. Lack of adoption is often not a technical issue, but a cultural one. To promote this idea, he launched The Data Culture Podcast, where he interviews practitioners and leaders who share their experiences in building a strong data culture within their organizations.

What is data culture?

As Carsten explained, data culture refers to the unwritten rules that shape how organizations work with data. It encompasses the values, beliefs, and behaviors that support the effective and ethical use of data across the organization. At its core, data culture defines how people think about and interact with data, how they use it, and for what purposes. It also determines how organizations leverage data to drive decision-making, process improvement, and innovation.

Data culture eats data strategy for breakfast

One of the most common questions organizations face is: “How do we build a strong data culture?” Dr. Bange highlighted that “data culture eats data strategy for breakfast,” as even the best data strategies fail without the right behaviors and ways of thinking. Implementing data culture is not about issuing a directive. Culture can’t be turned on or off. It must be influenced through ongoing, targeted efforts. Carsten shared a framework of areas that organizations need to tackle when they want to improve their data culture. It covers six areas.

Data literacy

Enhancing data literacy is often the starting point. Upskilling employees, increasing their confidence and competence with data, and fostering a shared understanding of its value are foundational steps.

Data access

Data culture depends on data access. In many organizations, it is limited either by technical constraints or by restrictive access rights. There are two models of data access. The first is “need-to-know”: access must be requested and approved. The second is “right-to-know”: data is open by default unless it is sensitive, like HR or personal information. The latter fosters trust, openness, and initiative. People have access to data, and they can use it to bring benefits to their organizations.
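To make the difference between the two access models more concrete, here is a minimal sketch in Python. It is purely illustrative and not part of Carsten’s framework; the dataset names, sensitivity flags, and approval logic are hypothetical stand-ins for what would normally live in a data catalog or access-management platform.

```python
from dataclasses import dataclass, field

@dataclass
class Dataset:
    name: str
    sensitive: bool = False                              # e.g., HR or personal data
    approved_users: set = field(default_factory=set)     # explicit grants

def can_access_need_to_know(user: str, ds: Dataset) -> bool:
    # "Need-to-know": closed by default; every dataset requires an explicit grant.
    return user in ds.approved_users

def can_access_right_to_know(user: str, ds: Dataset) -> bool:
    # "Right-to-know": open by default; only sensitive data requires a grant.
    return (not ds.sensitive) or (user in ds.approved_users)

sales = Dataset("sales_orders")
hr = Dataset("hr_salaries", sensitive=True, approved_users={"hr_lead"})

print(can_access_need_to_know("analyst", sales))   # False: no grant has been requested yet
print(can_access_right_to_know("analyst", sales))  # True: non-sensitive data is open by default
print(can_access_right_to_know("analyst", hr))     # False: sensitive data stays gated
```

The point of the sketch is only the default: in the first model an analyst waits for approval even for harmless data, while in the second only genuinely sensitive data requires a request.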
Data communication

According to Carsten, communication plays a vital role in reaching people and shaping their behavior around data. To build a strong data culture, leadership must consistently communicate the strategic value of data to their employees. They should show how data aligns with business goals and supports competitive advantage. It’s also worth sharing success stories and role models. Real examples of how data has driven results, like increasing revenue or gaining new customers, can motivate others to change their attitude toward the data they have.

Data strategy

A successful data strategy must be based on the existing data culture. Ambitious plans for enterprise-wide AI or advanced analytics are unrealistic if employees lack the tools, skills, or access to data. Too often, strategies are overly technical. They focus on architecture or infrastructure but neglect the people who must use those tools. Data culture should be an integral part of any data strategy to ensure alignment with organizational reality and to support real execution.

Data leadership

Strong leadership is critical to fostering a data-driven culture. While grassroots efforts are valuable, they reach a limit without top-down support. Senior leaders must actively promote data initiatives and model the behaviors they want to see. Carsten pointed out that the biggest blockers are often in middle management, where key resource and access decisions are made. If middle managers withhold support, it can slow down cultural progress.

Data governance

This component is about balance. Too little governance leads to chaos. Too much creates fear and resistance. Overly strict rules or legal-heavy processes can discourage people from working with data at all. Effective data governance should enable data use, not restrict it. It should guide employees, support data quality, and create clarity without driving anxiety. In a positive data culture, governance is seen as a help, not a hurdle.

Who benefits the most from data culture?

According to Carsten, company size is not the deciding factor when it comes to benefiting from a strong data culture. That’s a conclusion he has drawn after years of working with a wide range of organizations and interviewing nearly 150 guests on The Data Culture Podcast. Of course, large organizations often have more resources to work with data. For example, they can form dedicated data culture teams. Such teams may focus solely on promoting data literacy, leading internal communication efforts, and organizing events like annual award ceremonies celebrating successful data projects. This structured approach allows them to scale data culture initiatives across the enterprise. Smaller companies may not have formal teams, but they can still adopt the same principles. While the scale and execution differ, the core concepts and framework remain fully applicable.

What companies succeed in building a data culture

In his discussion with Max, Carsten mentioned a strong link between overall company culture and the success of data culture initiatives. For example, organizations moving toward data products tend to succeed when their company culture already promotes collaboration, openness, and knowledge sharing. In contrast, organizations with siloed, disjointed cultures often struggle with such approaches. Among the forerunners in data culture, Carsten named Merck, a global pharmaceutical leader that has a dedicated data-focused team.
Before the first pan-European Data Culture Summit, Bange conducted a study to identify organizations actively investing in data culture roles. The research, based on LinkedIn data, revealed that:

- Large enterprises are more likely to assign formal roles for data culture.
- Europe leads globally in adopting these roles, with the UK and Germany at the forefront. In Europe, the rise of data culture roles started around 2021, while in the US, it began two years later.
- South America is showing rapid growth and is even outpacing North America in the number of data culture-related roles.
- The financial services sector, including banking and insurance, is currently the most active in data culture, accounting for over half of all identified roles. This is likely due to the industry’s data-heavy nature and strong regulatory requirements for data governance and quality.

Data governance in decentralized data landscapes

Effective data governance must align with an organization’s structure and operational reality. "One-size-fits-all" models don’t work. Your model must be adapted based on whether the company is centralized or decentralized. Many organizations today are moving toward decentralization of structure and data ownership. This reflects a long-term trend in data and analytics: shifting responsibility and ownership closer to business units. It is also often accompanied by decentralizing platforms, tools, and access. Such a shift challenges traditional ideas of centralizing all data in one place. The once-dominant data warehouse approach, which aimed to consolidate all data centrally, is no longer practical for many organizations. The growth in data volumes, the rise of real-time IoT data, and increasing complexity make it difficult (and sometimes even impossible) to bring all data together in a single location. Instead, modern data architectures often follow distributed models, such as data fabric, which help to maintain a coherent framework for interoperability and governance.

What drives data decentralization?

According to Dr. Bange, the engine behind decentralization in data and analytics is the need to scale data usage across the organization. To create a strong data culture, companies need to empower more people to actively work with data and analytics tools. Centralized models often create bottlenecks, either in data access or in the limited availability of central data teams. In many cases, central data teams are overwhelmed and can’t fully support the growing demand for analytics. As a result, business units need to decide whether they should wait or take the initiative themselves. Decentralization becomes a logical step here. Thanks to this approach, teams can access and integrate their own data, build local data capabilities, and act autonomously. One major benefit of decentralization is proximity to domain knowledge. Domain expertise is critical for building meaningful analytics or AI models. Being closer to the actual business processes allows teams to identify relevant use cases, involve stakeholders early, and ensure real-world adoption. This is especially important when transitioning from pilot AI projects to enterprise-scale deployment. The main challenges at this stage are often organizational, not technical. Scaling AI and analytics requires changing workflows and embedding new tools into existing processes. All these issues can be addressed faster when data teams are integrated within the business units.
Benefits of the hybrid approach

However, complete decentralization is often not the best choice. This is where hybrid models come into play. Hybrid models offer the most practical and scalable path forward for organizations navigating data governance. Dr. Bange explained that this approach strikes a balance between central oversight and decentralized autonomy, which means it can adapt to organizational complexity while enabling growth. There are two key reasons to centralize certain aspects of governance. First of all, some topics are too critical to leave to individual teams. Regulatory compliance, such as GDPR, is a prime example. Instead of having dozens of teams interpret and apply these rules independently, centralized governance ensures consistency and reduces risk. Secondly, limited expertise in emerging areas like AI often requires a centralized starting point. Over time, as capabilities mature, these roles can be gradually decentralized, and central units shift to supporting roles like education and community-building. At the same time, organizational diversity plays a crucial role. Within the same enterprise, different departments or regions can be at vastly different levels of data maturity. Some may have strong internal teams, platforms, and domain expertise. Others may rely heavily on centralized support and shared services. A hybrid approach acknowledges such differences. It allows flexible service models, where units can choose what they handle independently and what they consume from central teams.

AI’s influence on data culture and governance

The rise of AI has significantly shifted the conversation around data in organizations. What was once a specialized concern for data teams has now reached the boardroom. Executive leadership increasingly recognizes that AI requires high-quality, well-governed data to deliver real value. This understanding has reinforced the need for robust data governance practices. As companies aim to expand their AI capabilities, they must also address long-standing challenges around data quality and accessibility. Data and AI literacy are equally important across the organization. Just as with broader data culture efforts, successful AI adoption requires behavioral and mindset shifts. Employees must understand what AI is, know how to use it, and feel empowered to experiment with it. Access not only to quality data but also to AI tools and infrastructure remains crucial. Making AI capabilities widely available within the organization democratizes innovation but also increases the importance of governance frameworks to guide ethical and compliant usage. The introduction of the European AI Act underscores this point. While some organizations view it as restrictive, others see its value in providing clarity. With it, companies have received a stable framework within which they can build and scale.

AI-powered vs. traditional approach to data

When it comes to becoming data-driven, organizations face a common dilemma: should they fully rely on large language models and hope that AI is smart enough to help them work with data, or should they take a more conservative route and focus first on cleaning and organizing their data? Dr. Bange believes the real challenge is doing both at the same time. AI often acts as a trigger for companies to finally take a closer look at their data. Poor quality, outdated models, and years of underinvestment in data infrastructure are typical issues.
Ideally, organizations would fix their data first and then build AI use cases on top. But that approach isn’t realistic in a fast-moving environment. Nobody wants to hear that leveraging AI requires two years and several million euros just to clean the data. According to Carsten, it could be sensible to opt for a more pragmatic approach: find use cases where AI can deliver early value while simultaneously improving the data foundation. Such projects can demonstrate the potential of AI. They also provide time to make the necessary long-term investments in data quality.

Challenges of data culture and AI implementation

There are two major blind spots for organizations trying to implement data culture. The first is the human element. Amid the excitement around new AI models and technological advancements, companies often overlook the central role of people. As AI automates more tasks, the need for human oversight and engagement becomes even more critical. Building a strong data culture isn’t just about tools. It is also about collaboration and continuous learning. The second blind spot is underestimating the speed of technological change. Many organizations lack a clear grasp of how rapidly AI is evolving. This can make them slow to adapt or experiment. As a result, they may fall behind more agile competitors that embrace AI-driven automation and innovation more quickly.

Practical advice for technology leaders

At the end of their talk, Max asked Carsten to share recommendations on how to start building a data culture in an organization. The first tip was quite simple: just start. Too many organizations hesitate or overthink the process. However, taking action is vital. He also recommended using a structured framework, such as his own model with six key areas that influence data culture. This framework helps organizations assess where they currently stand and identify which aspects need the most attention. Dr. Bange also mentioned two areas that are often underestimated at the beginning of the journey: data access and data communication. Many companies don’t realize their importance until they are already a year or two into the process. And this can become a serious obstacle for them.

Want to get more expert insights into how to boost your business growth in the data-driven world? New Innovantage podcast episodes will shed light on this. Stay tuned!
Product Management
AI and Tech Due Diligence: What businesses and investors should know
April 15, 2025
10 min read

Agu is a Co-Founder and Partner at Intium Tech, a tech advisory firm specializing in helping large companies and private equity funds buy and sell tech businesses. Over more than 20 years of his professional journey, he has accumulated experience in such spheres as development, architecture, and executive leadership. All this helped him to get a good understanding of how the tech world works. Seven years ago, he transitioned into consulting, helping businesses with acquisitions, carve-outs, and value creation.

Every episode of the Innovantage podcast offers a new perspective on different business aspects and the role of technologies in them. This time, Max Golikov, the podcast host and the CBDO at Sigli, invited Agu Aarna to talk about tech due diligence and the impact of AI on the investment landscape.

Agu is a Co-Founder and Partner at Intium Tech, a tech advisory firm specializing in helping large companies and private equity funds buy and sell tech businesses. Over more than 20 years of his professional journey, he has accumulated experience in such spheres as development, architecture, and executive leadership. All this helped him gain a good understanding of how the tech world works. Seven years ago, he transitioned into consulting, helping businesses with acquisitions, carve-outs, and value creation. In 2021, he co-founded Intium Tech. With Intium, Agu and his team wanted to create a standardized approach to assessing technology, similar to what exists in other sectors. They recognized the need to describe technology in a clear, structured way for investors and business leaders. As they developed their system, they realized it could be integrated into software. This led to the creation of their own platform, which enables more efficient analysis of acquisition targets.

How technology affects business

In his dialogue with Max, Agu emphasized the complexity of technology’s impact on business. A minor technical detail can have significant business implications. That’s why assessing its true effect is crucial. Blindly following best practices is not the best approach. The focus should be on understanding their relevance to a company’s goals. For example, if a company doesn’t run unit tests, it’s not just about missing a best practice. First of all, it should raise questions about the quality of its solutions, leadership, and overall strategy. It’s necessary to find out why that is the case. According to Agu, the key lies in finding a balance and understanding both the business’s ambitions and how technology can support them. This dynamic relationship between business goals and technology is what he finds most important.

Challenges in tech due diligence

Tech due diligence (TDD), one of the core areas Agu’s firm focuses on, is a detailed examination of a company’s technology infrastructure, products, and processes, typically conducted before a merger, acquisition, or investment. As Agu highlighted, the approach to such analysis has evolved significantly over the years. In the 2000s, it was viewed as a “nice to have” process: a couple of tech experts would assess a company’s technology, often producing a laundry list of issues based on their own expertise. This approach lacked a comprehensive view of the business impact. By the 2010s, tech due diligence had become more professional. It could already offer a broader perspective on leadership, architecture, and infrastructure. However, the analysis still lacked a focus on the actual business impact of these issues. In the 2020s, the focus shifted to understanding the business impact of technology and analyzing companies from this perspective. However, inconsistencies in reporting remained a challenge. Different experts can emphasize different aspects, which leads to varying results. This issue highlighted the need for a standardized approach.

How to make TDD more efficient today

Agu believes that to solve this, the industry needs more consistent, high-quality analyses.
This could be achieved by leveraging software instead of relying on people-driven processes. This shift toward software-powered solutions, like the one developed by Intium, aims to provide a more scalable and smooth approach to tech due diligence.

When discussing tech due diligence, Agu also highlighted two key aspects to focus on. First, it’s crucial to educate clients that tech due diligence is more than just a code or architecture review. Technology is the engine that powers a company, but just like a car, it needs to be steered in the right direction. Evaluating technology requires understanding its context within the business, not just identifying flaws in infrastructure or architecture. Equally important are the people managing the technology and the processes that connect them. Inefficiencies here can quickly undermine technical strength. The second key aspect is taking a comprehensive 360-degree view of the company. Concentrating on only one part of the technology or business won’t provide the full picture. Without this broader perspective, risks and elements crucial to making the deal successful might be overlooked.

Moreover, Agu identifies several key risks in tech due diligence that can lead to failed deals:

- One major risk is when technology is presented as a core asset but doesn’t live up to expectations.
- Another risk is technical debt and architecture. If the debt requires too much effort to manage or fix, it can cause a deal to fall apart.
- A third risk is insufficient preparation for the sales process by the target company. When private equity firms are considering mature companies, a lack of proper preparation can reveal too many unknowns, making the deal seem too risky.

A well-conducted TDD not only helps determine whether to buy a particular company but also provides information to negotiate the price, impose conditions in the purchase agreement, and even structure earnout plans.

Key factors investors should pay attention to

It is commonly believed that when you are investing in tech businesses, technology always remains the key factor to evaluate. However, this is not always true. Agu explained that in early-stage investments like seed, pre-seed, and Series A or B, technology is often secondary (as at such stages there is hardly any tech at all). What investors are looking at are the ideas and leadership. Investors should focus on exploring whether the leadership team understands the technology they are working with. Here, the key task is assessing the leadership’s technical acumen to ensure they can build and execute on their vision. As companies move into the growth phase, product-market fit is already established, which means technology becomes crucial. Scaling the technology to support growth is a different challenge from proving a market problem. This makes tech due diligence more important at this stage. In private equity, where mature companies are involved, technology is already a significant factor. Agu stressed the importance of being transparent and truthful when communicating with investors. If a company misrepresents its technology or misleads investors, it can result in the collapse of the entire deal.

AI wrapper companies: Good or bad?

While talking about tech innovations, Max mentioned the growing number of so-called AI wrapper companies. They build user-friendly interfaces or apps on top of existing AI technologies, often providing a simpler or more tailored experience for end users.
Instead of developing their own AI models or deep technologies, these companies focus on wrapping AI capabilities into practical solutions. They interact directly with users and often become "sticky" due to people’s habits. Agu believes there is nothing wrong with establishing a wrapper company. In fact, being a wrapper company can be even more important than being a deep tech innovator like OpenAI. He pointed out that AI wrapper companies need to work in specialized areas like prompt engineering, which may not require deep tech knowledge but still involves particular skills. These companies must know how to effectively augment prompts and optimize user interaction. He also noted that developing and hosting AI can be expensive, adding another layer of complexity for companies in this space. According to Agu, building your own AI is not impossible. However, convincing investors that the team has the expertise to do it is challenging, as AI can be very technical. When evaluating an AI company, it is crucial to determine whether AI is truly the right tool for the stated problem. For example, traditional mathematical or statistical models may work as well as AI in some cases, and using AI unnecessarily could signal a lack of understanding of the problem. However, in competitive markets, simply being a wrapper around AI isn't enough. Teams behind such projects must specialize in and understand how AI works. This is also necessary for choosing whether to use off-the-shelf solutions or develop their own models. Privacy is another major concern, particularly in regions like Europe, where data protection is strict. In some cases, companies opt to develop their own AI in order to avoid privacy issues with third-party systems.

Impact of AI regulation and privacy laws

AI regulation and privacy laws, such as GDPR, have sparked significant debate. Nevertheless, over time, they have proven to be quite manageable and even beneficial. For instance, GDPR served as a template for other laws like the CCPA in California and the UK’s data protection frameworks. These regulations were initially seen as hurdles, but now they are generally accepted as necessary for privacy protection. There is a concern that regulation can stifle innovation. This can happen not necessarily because of any barriers it creates, but because of the lack of input from business and tech representatives during the drafting process. A more collaborative approach that includes industry experts can make regulations much more balanced and practical. Regulations are important for protecting personal data. It is crucial to remember that not all market players have good intentions. Without regulation, the misuse of personal data, especially in AI training, could lead to manipulation on a massive scale. Proper regulation ensures that the technology benefits society without being exploited. Policies serve as a tool to raise awareness and guide behavior. They are like a friendly reminder to look both ways before crossing the street, providing useful information that helps keep us safe. When viewed in this light, regulations aren’t obstacles but safeguards that help us navigate potential risks. As AI and technology continue to connect us more deeply, establishing ground rules becomes essential. These rules will help define what data can be used and under what circumstances, ensuring that people are not overwhelmed by the complexities of these technologies. With proper guidelines, people can better understand and trust the systems in place.
This clarity is vital for preventing confusion and misuse as the tech landscape evolves.

Future of AI for investors

These days, there is a lot of talk about the role of AI in different industries and domains. That’s why Max couldn’t help but ask Agu to share his vision of the role of AI in tech due diligence. AI is already being used by investors, particularly in early-stage analysis. Today, investment firms leverage AI to gather data on potential companies, analyze it, and automate certain tasks. For example, AI can notify investors when a company becomes more lucrative, prompting further investigation. Investors can also use tools like ChatGPT to ask AI for advice about companies. AI plays a significant role in the early stages of investing, and its use extends to later stages and new purposes. However, relying entirely on artificial intelligence without expertise can be risky. If you feed a company’s documents into an AI tool like OpenAI’s ChatGPT and ask for a summary of the top issues, the technology may provide a polished response that seems accurate but could be misleading. This is because AI sometimes hallucinates and fills in gaps with logical but incorrect information, leading to wrong conclusions. This can be especially problematic for non-experts who might be misled by the polished language. AI is particularly useful for summarizing large amounts of data. But it should always serve as a tool to support expert analysis, not replace it. The key is using AI’s output as an input to the expert’s thinking while checking that AI doesn’t miss important details. This approach allows for more accurate and reliable results. AI has made significant progress in assisting with due diligence. However, it is still not at the point where it can fully conduct the process on its own. Connecting AI findings to the investment thesis and business impact remains a significant challenge. While AI can provide valuable insights, human expertise is required to make sense of AI-generated data in a meaningful way. In the future, AI may gradually take over more tasks, with humans focusing on areas where AI struggles. However, a key challenge will be ensuring that AI systems continue to evolve. They need constant feedback to stay updated with new information, trends, and market shifts. Without this ongoing learning, AI may become outdated and far less helpful.

Investment opportunities and trends in the tech market

While talking about current investment opportunities, Agu noted that in recent years, many specialized startups have emerged. What makes them successful is their focus on niche products that effectively solve specific market problems. According to Agu, today a lot of private equity funds are sitting on a significant amount of dry powder, meaning there is capital ready for immediate investment. This situation suggests that a period of consolidation may be on the horizon, where smaller companies are acquired and merged into larger corporations. This trend is likely to create opportunities for venture capital and growth equity investors who have supported these niche companies. In particular, AI wrapper companies, if they solve a real problem and maintain strong customer relationships, are well positioned in this environment. In conclusion, Agu agreed with the common opinion that AI is here to stay. It is expected that this domain will become increasingly efficient over time. We will likely see the emergence of more advanced AI use cases and implementations.
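Returning to the expert-in-the-loop point Agu made above, here is a minimal, hypothetical sketch in Python of what treating AI output as an input to expert judgment (rather than a final answer) might look like. It is not Intium’s platform or any real workflow: the `summarize_with_llm` function stands in for whatever model or API a firm actually uses, and the review rule is deliberately simplified to the single idea that unsupported claims must not be passed on as facts.

```python
from dataclasses import dataclass

@dataclass
class Finding:
    claim: str            # an issue the model says it found in the documents
    source_excerpt: str   # text the model points to as evidence ("" if none)
    confirmed: bool = False

def summarize_with_llm(documents: list[str]) -> list[Finding]:
    """Hypothetical stand-in for an LLM call that extracts top issues.

    A real implementation would call a model API and parse its output;
    here we return canned findings so the sketch stays runnable."""
    return [
        Finding("Licensing terms of a core dependency are unclear",
                source_excerpt="Section 4.2 of the vendor agreement ..."),
        Finding("The platform cannot scale beyond 10k users",
                source_excerpt=""),  # no evidence cited: a possible hallucination
    ]

def expert_review(findings: list[Finding]) -> list[Finding]:
    """The expert keeps only findings that can be traced back to the documents;
    everything else goes back for manual checking, not into the report."""
    for f in findings:
        f.confirmed = bool(f.source_excerpt)
    return [f for f in findings if f.confirmed]

confirmed = expert_review(summarize_with_llm(["... data room documents ..."]))
for f in confirmed:
    print("Confirmed issue:", f.claim)
```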
At the same time, all of these AI systems will still require resources to operate. Therefore, anything that powers AI is likely to remain essential moving forward, which is quite an expected trend.

And if you want to learn more about current and future trends in the business world, the Innovantage podcast is exactly what you need. The next episodes will be available soon (and don’t forget to check whether you’ve missed any of the previous ones)!
Product Management
What is the secret of startup success?
April 1, 2025
10 min read

Every startup founder wants their business to achieve success. But does every startup founder have the required traits that will lead their business to success? This was one of the questions that Max Golikov, the Innovantage host and Sigli’s CBDO, addressed to his podcast guest Mike Sigal.

Every startup founder wants their business to achieve success. But does every startup founder have the required traits that will lead their business to success? This was one of the questions that Max Golikov, the Innovantage host and Sigli’s CBDO, addressed to his podcast guest Mike Sigal.

Mike is an expert with over 35 years of experience as both a founder and investor. He is now the founder of Sigal Ventures, a Venture Partner at GPO Fund, MiddleGame Ventures, and Pella Ventures, and serves on the Investment Committee for SC Ventures. During his professional journey, he has seen the peculiarities of the entrepreneurship world from different perspectives. In his conversation with Max, they also discussed the current state of the fintech market, the challenges of the VC industry, and the value of resilience in the business space.

Entrepreneurship is a force for good

Over the years, Mike founded or co-founded eight startups, ranging from graphic arts, cloud databases, analyst firms, software, and fintech to a nonprofit. His journey included raising venture capital, experiencing both successes and failures, and serving as an executive at a company through an IPO. He has been through the entire startup-to-exit journey multiple times. Between startups, Mike consulted for mid-to-large corporations, which led him to work with SWIFT. There, he helped bridge the gap between startups and global banks, creating a competition that introduced fintech unicorns like Wise and Revolut to the industry. Among his other career milestones, he was invited to join 500 Startups (now known as 500 Global) as an Entrepreneur in Residence. In this role, he helped them build their fintech acceleration program and became a General Partner of their Fintech Fund.

The COVID lockdown became a turning point for Mike. By 2019, he already knew that being a VC wasn’t what he loved most, while working directly with founders was. Before the pandemic, his role meant constant travel. His tasks and responsibilities included keynoting conferences, meeting investors, connecting with entrepreneurs, and exploring startup ecosystems worldwide. But when the world shut down, that part of the job disappeared. Mike was forced to slow down and reflect. At that time, he realized what truly mattered: helping others. With decades of experience, he decided to shift his focus from investing to coaching founders and fund managers. According to Mike, entrepreneurship is a force for good. Supporting those building the future became his most fulfilling work.

How not to take the wrong path

Mike believes both entrepreneurship and venture capital require thinking in long-term cycles, often 10 years or more. A single VC fund takes years to raise and another 7 to 10 years to run. A venture-backed startup typically needs the same timeline to get to liquidity. According to Mike, before diving in, future founders and investors should ask themselves whether they truly love the journey enough to commit a decade (or even more) to it. Success on either path isn’t about the next quarter or year but about embracing those long cycles. For Mike, the way to stay balanced in professional life is to focus on what brings daily joy. Mike emphasizes the importance of regular self-reflection as a discipline, whether daily, weekly, or monthly. He compares it to customer development, but in this case, you are the product.
The process involves asking:

- Where am I trying to go?
- What am I learning?
- What new questions are surfacing as I grow?

Just as talking to customers reveals insights, reflecting on your own journey helps uncover where the real value lies. Mike believes this practice builds both confidence and clarity, not just for yourself but for anyone you mentor. Only we ourselves are responsible for our own growth. If we are not controlling our own personal and professional development, then who is?

What constitutes a good founder?

Mike decided to explore what traits VCs look for in founders and turned to artificial intelligence to help him. He asked ChatGPT to focus specifically on insights from seasoned investors who have backed unicorns. He formulated a set of key questions:

- What founder traits do VCs value most and why?
- What traits make the biggest difference in entrepreneurial success?
- What early-stage behavioral or psychological signals indicate potential?
- What tools can help surface those traits?
- What are the red flags?

The exercise highlighted five top traits VCs commonly seek:

- Visionary leadership: the ability to see some version of the future and inspire others. ChatGPT offered Elon Musk and Steve Jobs as examples of founders with this trait.
- Exceptional execution: the skill of turning vision into reality. Jeff Bezos was named as an example.
- Resilience and grit: the ability to push through setbacks. Here, ChatGPT named Brian Chesky of Airbnb.
- Deep market insight and domain expertise: crucial for disrupting industries. Melanie Perkins of Canva was mentioned here.
- Adaptability and fast learning: being agile and pivoting quickly when needed. According to ChatGPT, Stewart Butterfield of Slack possesses these traits.

Just for fun, Mike also gathered common red flags that cause VCs to pass on deals. He shared them in a room full of VCs and founders and asked them to raise their hands if they had ever rejected a deal for each reason. Every hand went up each time. Here are some of the items on the list:

- Unbalanced teams (all technical or all business team members);
- Frequent staff turnover;
- Founders lacking self-awareness;
- Romantic relationships between co-founders;
- Founders working on too many projects;
- Founders with narcissistic tendencies.

While discussing this topic, Mike mentioned research from Defiance Capital, which studied 2,018 unicorn founders in the US and Europe from 2013 to 2023. The findings revealed three common drivers behind unicorn founders’ success:

- No plan B. These founders were all-in. There was no safety net or fallback. For them, failure wasn’t an option.
- A chip on the shoulder. They had something to prove, whether to themselves, the world, or both.
- Unlimited self-belief. They truly believed they could make it happen, no matter the obstacles.

According to Mike, these traits often separated unicorn founders from the rest. They can also be framed in the context of another, even broader notion: resilience.

The power of resilience

While talking about resilience and its role in business, Mike mentioned Hummingbird Ventures, one of Europe’s top-performing venture funds. This fund is known for its unique thesis: it invests primarily in founders who are neurodiverse or trauma survivors.
This approach is based on the belief that these people see the world differently and possess exceptional resilience. Founders who have overcome extreme challenges, whether growing up in war-torn regions or rising from refugee camps, often develop the inner strength needed to navigate the tough journey of building a company. Hummingbird sees that experience as a competitive advantage.

Learning from failure

When it comes to failure, its role (and the value of lessons learned) shouldn’t be underestimated. Mike shared a personal story about his fear of public speaking. Early in his career, after his startup was acquired, he found himself a senior executive leading product and technology through an IPO. At his first major organizational meeting, surrounded by lawyers, bankers, and other executives, he froze. At that moment, he realized that he could easily let his team down because of his fear. That became a turning point. From then on, he committed to improving his communication skills, especially in high-stakes settings. Such failures can be very painful. But they often become a catalyst for growth, especially in corporate environments where failure is less tolerated than in the startup world.

You can’t control how quickly the market or the world around you changes. That’s a given. The real question is: how fast and how efficiently can you learn? If you, as an individual, a team, or a company, can learn faster and cheaper than the others, your chances of winning go way up. What truly makes the difference is your ability to learn from mistakes. If you can minimize the cost of those mistakes while maximizing the speed of learning, you naturally start moving faster than everyone else. And that’s where the edge comes from.

The fintech industry today

In 2021, the global financial services industry represented about $12.5 trillion in market capitalization. Of that, only around 2% was fintech. Projections suggest that by 2030, the industry will grow to $22 trillion, but fintech will still only make up about 7% of that total. There’s still a long way to go before financial services are truly transformed by modern technology. Yes, many financial institutions already use technology. However, a lot of solutions are 40 or 50 years old, built on legacy systems that weren’t designed for the digital age. It’s also worth noting that financial services remain one of the most profitable industries on the planet, with gross margins of 18%. That translates to roughly $2.3 trillion in annual profit up for grabs. This is an enormous opportunity for entrepreneurs and investors. For comparatively simple things like retail and small business savings accounts or sending money internationally through platforms like Revolut and Wise, a lot has already been done.

Emerging trends in the fintech world

What is coming next in the fintech space is much harder but also much more interesting. Technologies like AI, embedded finance, and, finally, a clearer global regulatory framework are maturing and could reshape the industry.

AI in fintech

In the context of using emerging technologies in fintech, Mike mentioned the findings of a Bank of America research report. The research revealed that over the last 20 years, productivity across S&P 500 companies skyrocketed.
Specifically, the number of employees required to generate $1 million in revenue dropped from about nine people to just over one. And that was even before generative AI tools like ChatGPT became widely accessible. Just imagine how many people global financial institutions employ, and then think about the productivity gains that AI could unlock. It gives you a hint of the scale of change that might be coming.

Tokenization

Another concept that, according to Mike, looks quite promising is tokenization. When considering tokenization, it is important to set aside speculative crypto. This isn’t about meme coins or hype-driven tokens. Instead, the focus is on real-world assets, like buildings, infrastructure, and commodities. Today, there are an estimated $475 trillion or more in real-world assets globally. The vast majority of this is still managed through paper-based processes and PDF documents. Digitizing these assets and automating their management could dramatically improve efficiency. Furthermore, tokenization would allow these assets to be fractionalized into much smaller pieces, enabling access to investment opportunities that were previously out of reach for most people. For example, people in Sub-Saharan Africa could invest in a fractional share of Apple or own a small piece of a revenue-generating office building in London. If regulated by strong, modern frameworks, tokenization could unlock a more inclusive and efficient global financial system, where access to high-quality assets is democratized on a global scale.

Embedded finance

The idea of embedded finance is something that Mike really likes, particularly in terms of its potential to drive growth in emerging markets. However, he believes that the path to growth in such regions doesn’t solely lie in increasing venture capital investments. While many may advocate for more VC funding, he thinks the true opportunity lies in deploying more debt into emerging markets. At present, major institutions like the World Bank, IFC, Goldman Sachs, and others are limited to operating with large-scale debt investments, typically in the billions of dollars. This is largely due to the high costs associated with underwriting and the profitability goals these organizations are trying to achieve. The challenge, according to him, is that these large institutions are constrained by the size and scale of their debt, which doesn’t always meet the needs of smaller, more localized markets. At the same time, these markets could greatly benefit from more accessible, tailored financial solutions. Embedded finance could act as a bridge to solve this issue, offering scalable, more adaptable solutions to drive growth without being confined by traditional financial models.

VC cycles and key startup challenges

When it comes to corporate VCs, there are several things they need to be mindful of when looking at the market, founders, and potential unicorns. One of the biggest challenges they face is the timeline mismatch between startups and corporations. Startups often operate on rapid timelines, moving quickly to develop products, secure investments, and scale. For example, a startup may be able to code and pitch a product in a matter of weeks or months. However, corporations typically work on annual or quarterly cycles. As a result, it becomes much more challenging for them to move at the same pace. This timeline mismatch becomes especially evident when a corporate VC is looking to make a major technology investment.
The process within a corporation can take a significant amount of time: perhaps 18 months to make an investment decision, another 18 months for procurement, and another 18 months for deployment. But even within the first 18 months, a startup may not survive due to lack of funding or other factors. Corporations, on the other hand, often expect to see a return on investment within a couple of quarters or a year. However, early-stage startups typically require a 7 to 10-year horizon before they can generate liquidity. This disconnect between the timelines and expectations of startups and corporations creates significant challenges for both sides. To collaborate successfully with startups, corporate VCs need to recognize these challenges and adjust their expectations. This will help to avoid misunderstandings and missed opportunities.

As you can see, there is still a long way to go toward building an ideal environment in this space. Nevertheless, it is precisely such challenges that forge founders and help them reach new heights. The tech world and the startup ecosystem are highly dynamic. Therefore, the ability to adapt and learn from mistakes in the shortest possible time remains one of the most important priorities on the path to success.

If you want to know more about what is happening in the tech industry and understand the trends shaping its future, don’t miss the upcoming podcast episodes, where Max Golikov and his guests will continue sharing inspiring insights.
Generative AI
Exploring AI's Role in Education with Oxford's Dominik Lukes
March 23, 2025
15 min read

Max Golikov speaks with Dominik Lukes about AI's impact on education, discussing how tools like large language models are transforming research, homework, and learning processes while addressing challenges like AI hallucinations.

AI has become a buzzword. But do we really know a lot about it? Can we fully leverage the new opportunities it brings us? To dive deeper into this topic and to make this space more transparent to everyone, we’ve launched the Innovantage podcast. In this series of episodes, Sigli’s CBDO Max Golikov talks to AI experts who share their professional opinions on how AI is transforming the world around us.

Our first guest is Dominik Lukes, System Technology Officer at Oxford, who runs the Reading and Writing Innovation Lab. Dominik has been exploring the potential of artificial intelligence since the early 90s, long before the world became familiar with ChatGPT. In this episode of the Innovantage podcast, Max and Dominik discussed the impact of AI on the education sector and its potential to revolutionize the academic environment. Moreover, they touched on the basics of generative AI, the working principles of LLMs, and even the probability of an AI apocalypse.

Check out the full Innovantage episode with Dominik Lukes here. Have no time to watch it now? We’ve prepared a short summary for you!

Key terms that you need to know

To begin with, let us briefly explain the main terms related to the topic under consideration. Any AI tool is based on a model, and a model is a set of parameters. These parameters ensure that if you feed something into a model, it will give you something back. The models used in ChatGPT and similar solutions are models that generate language.

What are LLMs?

But what does it mean when we say “large language models” (LLMs)? What makes them large? The free version of ChatGPT available at the moment relies on a corpus of about half a trillion words or thereabouts, which is an enormous number. As for GPT-4, OpenAI hasn’t revealed precise figures. But when Meta released a new large language model called Llama 3, they said it was pre-trained on over 15T tokens, all collected from publicly available sources. The bigger the corpus used for pre-training, the higher the quality you can expect. Model size is also measured in parameters: some small models have 8 billion parameters, while large models have hundreds of billions.

Why do you need to pre-train AI models?

An interesting thing, and an important breakthrough in AI, is that it is not necessary to train AI for every single task separately. You can take all these 15 trillion tokens and pre-train a model with some basic cognitive capabilities. After pre-training, it’s time for fine-tuning on top of that, which will make your model do other things. Companies are constantly fine-tuning their models, which is why the models keep changing: something that worked last month may not work this month. To achieve the desired results, the data used for pre-training has to be clean and carefully selected within the abilities of your algorithm. Your model has to be pre-trained and then fine-tuned for a particular purpose. That’s one of the things that will make your solution work better.

How do AI models work?

The work of these models can be compared to a regression curve, which is a kind of prediction curve. While there is an opinion that such models simply work on frequencies and occurrences, what they actually have inside are weights and relationships. Dominik compared such models to semantic machines.
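As a deliberately tiny illustration of the "prediction" idea discussed above (our own toy example, not something from the episode), here is a bigram model in Python. It literally works on frequencies of word pairs, which is the simplest possible form of next-word prediction; GPT-scale models replace these raw counts with billions of learned weights inside a transformer architecture, but the basic task of assigning probabilities to the next token is the same.

```python
from collections import defaultdict, Counter

corpus = "data culture eats data strategy for breakfast . data culture matters .".split()

# Count how often each word follows each other word (bigram frequencies).
following = defaultdict(Counter)
for prev_word, next_word in zip(corpus, corpus[1:]):
    following[prev_word][next_word] += 1

def next_word_probabilities(prev_word: str) -> dict[str, float]:
    counts = following[prev_word]
    total = sum(counts.values())
    return {word: count / total for word, count in counts.items()}

# After "data", this toy model predicts "culture" with probability 2/3 and "strategy" with 1/3.
print(next_word_probabilities("data"))
```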
These models are semantic in the sense that they understand relationships between things, but they are not semantic in the sense of understanding the world outside themselves.

GPT: What is it?

Have you ever thought about what this abbreviation means? Actually, these three letters stand for what we’ve just explained about the work of such models.

- G is for Generative. It means that the model is capable of generating text.
- P is for Pre-trained. It means that the model is pre-trained on a large corpus of data, learning patterns, grammar, facts about the world, and some reasoning abilities.
- T is for Transformer. This refers to the underlying architecture of the model used for natural language processing.

AI hallucinations

If you have ever worked with LLMs, you’ve probably noticed that sometimes they provide inaccurate answers or “invent” something that doesn’t really exist. In other words, these models can still hallucinate despite massive improvements. This happens because models are trained on data and learn to make predictions by finding patterns in it. With biased or incomplete data, a model may learn incorrect patterns, which results in wrong predictions.

How to make AI work better

Unfortunately, AI models can’t teach us how to communicate with them correctly. And let’s be honest, interacting with AI is not the same as communicating with a person. You should be ready for a ”rollercoaster”: sometimes AI tools can go far beyond your expectations, while at other times their outputs may be disappointing. To achieve better results, you should experiment, try different prompts, and develop your own approach to making AI solve your tasks.

Not by ChatGPT alone: AI-powered tools that are used now

AI Tools Used at Oxford by Dominik Lukes

When ChatGPT was made publicly available in November 2022, it caused enormous hype practically immediately. Let’s be honest: in mass perception, ChatGPT has become a synonym for generative AI. Nevertheless, that’s far from the truth. Today there is a huge number of various tools, whose functionality can greatly differ from what ChatGPT offers. First of all, you can start your familiarization with AI with the so-called Big Four. Apart from ChatGPT by OpenAI, it also includes:

- Claude by Anthropic;
- Gemini (formerly known as Bard) by Google;
- Copilot by Microsoft.

People take these popular models and use them to build different tools. For example, Elicit is one such tool that can help you with research. It can search for papers and extract information from them. Of course, you will still need to check the results, but you will get a really good draft. There are also projects that build on the custom GPT capabilities these vendors have released, which allow people to create, for example, custom bots within ChatGPT or Copilot. And by using the APIs, it is possible to build solutions outside of these platforms.

According to Dominik, we are currently at the stage where everybody is trying to see what AI can do for us now. But we are also starting to explore what it can do for us in the future and what the possibilities are. Such a highly respected educational institution as Oxford is also actively exploring the potential of AI, along with the rest of the world.
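To illustrate the point above about building solutions outside these platforms via their APIs, here is a minimal sketch using the OpenAI Python SDK. It is only one possible pattern, not something described in the episode: the model name and the prompt are assumptions for illustration, the same idea applies to Anthropic’s, Google’s, or Microsoft’s APIs, and the current documentation should be checked before relying on any of it.

```python
# pip install openai  (expects an OPENAI_API_KEY environment variable)
from openai import OpenAI

client = OpenAI()

def summarize_paper(paper_text: str) -> str:
    """Ask a chat model to summarize an academic paper in plain language."""
    response = client.chat.completions.create(
        model="gpt-4o-mini",  # assumed model name; use whatever is current
        messages=[
            {"role": "system",
             "content": "You summarize academic papers in plain language for students."},
            {"role": "user",
             "content": f"Summarize the key findings of this paper:\n\n{paper_text}"},
        ],
    )
    return response.choices[0].message.content

# Example usage (this would make a real API call):
# print(summarize_paper(open("paper.txt").read()))
```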
Dominik shared that Oxford is experimenting with ChatGPT, its enterprise version, integrations with Copilot, as well as other innovative tools powered by AI. In this context, it’s highly important for researchers to understand what students think about various solutions, what they find useful, and how they can benefit from the integration of AI into the learning environment.

Dominik also shared his personal thoughts. According to him, Claude is a good tool for educational purposes. It can deal with long context, which means you can upload an entire academic paper and ask it to provide a summary or to find specific information in the text. This feature sets Claude apart from ChatGPT. And it can be highly helpful not only for students or professors but also for businesses.

Homework is dead. But what about education itself?

When it comes to education and the changes that AI has brought to it (and will bring in the future), a lot of people are concerned about how to check the level of students’ knowledge. And their position is quite understandable. For example, the format of take-home exams used to be rather popular: students received tasks and were asked to complete them at home. Now, when we have so many AI-powered tools at hand, such tasks can be fully useless. You can no longer simply trust that all students who hand in their essays have written them entirely on their own. Things like composition, spelling, grammar, and other objective points that professors look at can be checked and improved by AI solutions. Of course, these tools are still far from perfect when it comes to research and in-depth analysis. However, that’s something we have on the horizon. Some teachers try to apply so-called AI checkers that are expected to detect AI-generated content. Nevertheless, AI experts insist that today there aren’t any reliable tools that can identify such content with 100% precision. There are different big and small models, and they generate content in different ways. Moreover, their outputs greatly depend on prompts. As a result, we can’t trust the results shown by these checkers.

How AI is integrated into the academic process at Oxford

But how can professors motivate students to learn new material if even their homework can be done with the help of artificial intelligence? Professors at Oxford have their own approach to the academic process that can be a good solution for many educational establishments. A big part of educational activities happens in small groups, which means students have a lot of discussions. So when they submit papers, they also have to talk about them afterward. As for exams at Oxford, many of them take place in an invigilated environment, so professors can see what the students are using. Dominik is quite optimistic about the integration of AI into the education process. Though it’s too early to speak about mass adoption, its further implementation will definitely continue. And the task for both educators and students is to find the best way to use artificial intelligence for their needs.

AI for teachers: How to use it now

Max and Dominik also talked about the use cases where teachers can apply AI already now. Here, Dominik shared one simple principle of working with AI solutions: you should ask the right thing from the right tool.
For example, ChatGPT can be really good at explaining math terms and concepts, but it is really bad at calculating and solving math tasks.
Similar things can be observed in other disciplines. Language teachers can greatly benefit from the ability of AI to create multiple-choice tests for students about a text or a grammatical feature. AI copes with such tasks perfectly.
Nevertheless, if you ask an AI model to create fill-in-the-blank grammar exercises, you shouldn't have high expectations. In this case, AI can offer the wrong options or put the gaps in the wrong places. Quite often, if you ask AI to give you an example of a grammar feature, you will get an answer that won't satisfy you. But when AI is simply generating a text for you, it won't make such mistakes.
AI generation still requires strong human supervision, just like an intern. It can work for you, but you still need to check the results it provides.
Skills for future students to work with AI
The educational environment is changing. How can we get ready for this AI-enriched world? Are there any specific skills that people should try to develop in order to work better with the newly introduced tools?
While answering these questions, Dominik highlighted that it is impossible to name a precise skillset.
However, here's a list of recommendations from a person who has been working with AI for many years:
Keep exploring.
Keep trying it.
And do not think that if you have used an AI a few times, you have explored the entire frontier of its capability.
Maybe in a year or two, professionals will identify specific skills that you need to know, but not now. There isn't one best tool or one best skillset for the academic environment, or for any other space.
AI for disabilities: Can it help people to overcome barriers?
Speaking about AI, it is also interesting to note the potential of such tools to change the quality of life for people with different types of disabilities. And here, it's worth paying attention not only to what such solutions can offer in the educational context but also in the context of everyday tasks.
Tools such as screen readers or text-to-speech solutions can be highly useful for people with low vision and different kinds of visual impairments. It is possible to take any webpage and ask AI to voice what is written or shown there. In other words, even if a person can't read or see something on their own, AI can do it. Of course, inaccurate outputs caused by AI hallucinations are still possible. But that's already a great step forward.
AI can also be of great help for those who have problems with writing and typing due to dyslexia or other issues. In this case, people can rely on speech-to-text features, as well as AI-powered grammar and spelling checkers.
Given this, we can say that artificial intelligence makes a lot of things accessible to people, even if previously they couldn't do them.
Talking about the capabilities of AI to expand existing borders for people, Dominik also mentioned that today not speaking English is already a huge limitation. Those who do not know this language are cut off from a huge part of the world, especially when it comes to learning, since a lot of materials are provided only in English. And here AI can also demonstrate its power. You do not need to wait until this or that research is translated into your native language. You can ask AI to do it for you and get a quick result.
And… Is an AI apocalypse inevitable?
Let us be fully honest with you.
That's just an eye-catching subheading. While some people are trying to guess what is going on in GPT's mind, experts like Dominik already know the answer: nothing. Really, nothing is going on in GPT's mind until the moment we send a question to the chatbot.
We are learning constantly; even when we are sleeping, our brains are changing.
Large language models, as well as other AI-powered tools, can't think the way we can. They are not exploring the world around them. If there are no requests from users, such models sit quietly, just like a blob of numbers on your hard drive. It means that we can feel completely safe.
Instead of the final word
The AI industry is advancing at an enormous pace. Even a couple of months can bring impressive changes, and half a year feels like a leap into a new era. That's why it's practically impossible to predict what comes next and when. So let's wait and see how AI tools will evolve and how education and other spheres will be impacted by these changes.
Looking for more insights from the world of AI? Follow us on YouTube, like our videos, ask questions in the comments, and do not miss the next episodes of the Innovantage podcast hosted by Max Golikov.
Subscribe to Innovantage YouTube Channel
Check Innovantage on Spotify
Listen to Innovantage on Apple Podcasts
Product Management
Startup Journey: Tech Business Growth and Role of Fractional CTOs in It
March 17, 2025
9 min read

How can tech startups survive today? How to find a good idea that will rock the market? Who can help you to guide your team if you have a limited budget?

To discuss these topics, Innovantage podcast host Max Golikov, who is also the CBDO at Sigli, invited Laimonas Sutkus to join him in his studio. Laimonas has robust expertise in helping businesses launch their projects and manage tech teams in such highly competitive fields as AI, fintech, health tech, and others.
In his career, he has gained experience as a software developer, tech advisor, CTO, and fractional CTO, working with businesses at different stages of their development. In this episode of the Innovantage podcast, Laimonas spoke not only about his professional path and the peculiarities of today's tech industry landscape but also shared valuable insights and practical recommendations for startup founders.
Being a Fractional CTO: What does it mean?
Laimonas began his fractional career in early 2024. As he admitted, before that he hadn't even known that such roles existed. According to him, he discovered the concept by chance through a LinkedIn post from another fractional CTO. This inspired him to explore the field.
A fractional CTO operates as a hands-on consultant and provides technical leadership to companies that don't require a full-time CTO. This role is particularly beneficial for non-technical businesses like marketing agencies and small pharma companies, as well as early-stage tech startups. Such teams may not need a full-time executive, but they still require expert guidance to avoid common pitfalls.
Unlike a traditional CTO, a fractional CTO is available on a part-time basis. It can be a few hours per day or even just a few hours per week.
What is important to highlight here is that this person is not a third-party consultant. This specialist is a full-scale team member, despite the limited hours that he or she devotes to your business per week.
This expert helps businesses navigate technical challenges, streamline processes, and make informed decisions.
The fractional model extends beyond CTOs to other executive roles, such as fractional CMOs and CFOs. All these roles follow the same principle: these professionals provide their strategic expertise without being full-time employees.
For a little less than a year, Laimonas worked as a fractional CTO. Now, however, he has a full-time job. Here are the key pros and cons of a fractional role as he defined them.
Advantages of being a fractional executive
Among the benefits, Laimonas highlighted the flexibility and security that come with a fractional career. Fractional employees can choose their projects and work with multiple clients.
Moreover, this approach helps to reduce financial risk. If you lose one or two clients, it doesn't mean that you will lose all your income at once.
In other words, a fractional executive operates as a one-person business and can maintain great autonomy.
Disadvantages of being a fractional executive
However, this independence also comes with challenges. Fractional professionals must handle not just their core expertise but also a wide range of other tasks, including sales, marketing, and client acquisition.
All these activities are traditionally managed by entire departments in a business.
As a result, Laimonas shared that a significant portion of his time was spent on prospecting, lead generation, and outreach rather than on his actual technical work.
For specialists like fractional CMOs, CFOs, or CTOs, the ideal scenario is to focus solely on their expertise. In Laimonas' case, his passion lies in technology, not in sales or marketing. Constant business development efforts can be very draining, and that's the key disadvantage of this career path.
How the AI landscape is changing
As artificial intelligence remains one of the most widely discussed topics today, Max and Laimonas couldn't leave it out of their conversation.
Laimonas joined the AI space long before this technology became mainstream. He has been building AI-based products since 2014.
Over the years, he witnessed how AI development changed with the emergence of large language models like ChatGPT. Previously, AI required hands-on data science, machine learning experimentation, and model deployment. Today, AI is more accessible. Developers can integrate it into products with simple API calls, avoiding the need for complex model training. This shift has allowed businesses to incorporate AI quickly and transform non-AI products into AI-powered solutions, sometimes in a matter of hours.
According to Laimonas, many startups used to approach AI as a standalone product rather than a tool. He mentioned Rabbit R1 and AI Pin as examples. These are gadgets designed to function as AI-powered assistants. Nevertheless, they failed, because they lacked a strong foundational business model.
Today, it has become obvious: AI is not a product in itself but a feature that can enhance existing solutions.
Laimonas believes that in the future AI will continue to be a powerful tool for gaining a competitive advantage. However, success will depend on integrating AI into solid business ideas. That will work much better than relying on AI as the core offering.
AI market realities
According to an article published by Sequoia, one of the biggest VC firms, the vast amount of capital poured into AI-based solutions now requires an additional $500–600 billion in revenue across these companies for the investments to break even. At the moment, it's difficult to say whether this target is achievable. However, it clearly highlights the significant financial pressure on the AI sector.
Laimonas mentioned that the gap between business profitability and AI investments exists not only for startups but also for major players like Google, Meta, and Microsoft. These tech giants lead AI development today because only they can afford the immense costs of training large-scale models. Such efforts often require tens or even hundreds of millions of dollars.
Despite this market situation, investors remain optimistic. This can be seen in the steady growth of the S&P 500 index, which tracks the stock performance of 500 of the largest companies listed on US stock exchanges. However, there is a notable concentration in the so-called "Magnificent Seven": seven major tech firms (Microsoft, Meta, Tesla, Amazon, Apple, NVIDIA, and Alphabet) make up nearly 30–35% of the index.
The last time such a concentration was observed was the dot-com bubble era.
AI: Is it just another bubble?
Laimonas sees obvious similarities between the current AI hype and the early 2000s internet boom. The internet was also a revolutionary technology.
It went through a speculative bubble that eventually crashed before stabilizing into long-term growth.
Could this happen to AI as well? The expert believes AI is following a similar trajectory. There was an initial boom. Now we can expect a likely correction that will ultimately result in a lasting impact.
According to Laimonas, AI is definitely a very good technology. Nevertheless, it is already being weaponized. Deepfake videos of world leaders, AI-generated propaganda, and automated disinformation campaigns are becoming widespread. Large language models, when integrated into social media platforms, further amplify misinformation. That's why it's also worth taking this "darker" side of AI into account when analyzing its role in our society.
The value of feedback for startup founders
Laimonas emphasized that one of the most important lessons for new founders is accepting that their initial ideas can be flawed. In the beginning, a startup's vision is rarely perfect, and founders must be willing to refine it. Instead of treating an idea as something sacred, they should focus on building a minimum viable product (MVP), testing it, and gathering feedback.
The reality is that most early concepts will fail. However, failure is part of the process. Founders must continuously iterate: seek feedback, adjust the product, and repeat the cycle, again and again, until product-market fit is achieved. The key is to remain adaptable and recognize when something gains traction.
However, not all feedback is equally valuable. Some users may explicitly state why they don't like a product. For example, they may explain that they stopped using it because the price is too high or because it doesn't address some of their needs. That's a very helpful type of feedback.
Nevertheless, more often the feedback is implicit: users simply don't engage. In such cases, founders must investigate why. This requires reaching out to former or inactive users, analyzing usage patterns, and identifying the reasons behind low adoption.
Deep, specific feedback is crucial to making the improvements that lead to success.
Why full-stack for early startups?
In early-stage startups, achieving product-market fit requires rapid iteration cycles. The faster a startup can implement and test changes, the higher its chances of success. The chosen technology plays a crucial role in this process: it can either accelerate development or become a bottleneck. It is the responsibility of a technical co-founder, fractional CTO, or experienced consultant to ensure the right technological choices are made to support fast iteration.
Traditionally, technical teams are structured with dedicated backend developers, frontend developers, QA specialists, and sometimes mobile engineers. While this model worked well in the past, it is often too slow for modern startups that need a competitive edge.
In response to such market needs, full-stack frameworks and technologies have started gaining popularity. They integrate multiple aspects of development into a single streamlined system.
Frameworks like Next.js and platforms like Vercel provide infrastructure, frontend, and backend capabilities in one codebase. As a result, they enable faster deployment and iteration (a short sketch of what this looks like in practice follows below). However, these technologies come with some pitfalls, such as vendor lock-in.
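To make the "one codebase" idea concrete, here is a minimal sketch of a route that fetches data on the server and renders it in the same file. It assumes Next.js's App Router conventions; the file path, data source, and component name are illustrative, not taken from the conversation.

```tsx
// app/users/page.tsx — a hypothetical route in a Next.js App Router project.
// The same file contains server-side data fetching and the UI that renders it,
// so one developer can ship the whole feature without a separate backend service.
type User = { id: number; name: string };

async function getUsers(): Promise<User[]> {
  // Illustrative data source; in a real app this could be a database query
  // or a call to an internal API.
  const res = await fetch("https://jsonplaceholder.typicode.com/users", {
    cache: "no-store", // always fetch fresh data on each request
  });
  return res.json();
}

export default async function UsersPage() {
  const users = await getUsers(); // runs on the server
  return (
    <ul>
      {users.map((u) => (
        <li key={u.id}>{u.name}</li>
      ))}
    </ul>
  );
}
```

One generalist developer can ship a feature like this end to end, which is exactly the iteration-speed argument made above.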
Coming back to those pitfalls: to fully unlock Next.js's benefits, software developers often need to use Vercel, which can be costly and restrictive.
Other frameworks, such as Remix, offer an alternative approach. For instance, Remix allows developers to write frontend and backend logic within the same file. This might seem disorganized at first. However, following strong design principles can result in a well-structured and efficient system.
A single full-stack developer in such a setup can often outperform a traditional five-person team consisting of separate frontend, backend, and QA engineers. The key advantage lies in eliminating communication overhead and reducing knowledge gaps. In other words, one developer can deliver entire features without dependencies on other specialists.
This shift toward full-stack development, combined with AI-assisted coding tools, significantly shortens iteration cycles. Features that previously took a full day to implement can now be developed in a fraction of the time.
For startups aiming to stay agile and efficient, prioritizing generalist developers who can build entire features independently is more effective than hiring narrow specialists. Specialization should come later, when the team grows to a size where dedicated roles in infrastructure, frontend, backend, and QA become necessary. Initially, focusing on generalists ensures maximum speed, flexibility, and resource efficiency.
Balancing the focus on today and tomorrow
Startups must strike a balance between focusing on immediate survival and planning for the future. While a long-term vision is important, over-prioritizing future scalability at the expense of present execution can be fatal. If resources are not managed well and iteration cycles are too slow, a startup risks running out of cash before it ever reaches the future it envisions.
The priority should always be profitability and survival.
Scalability issues, expansion challenges, and the need for team specialization are all positive problems. They signal that the business is working, clients are coming in, and revenue is growing. Growth problems indicate success, whereas failure to manage short-term sustainability can lead to an early shutdown.
Some degree of uncertainty is an inherent part of the tech industry. Tech teams constantly need to solve scalability problems. While the nature of these problems evolves, the challenge itself never disappears. Mature IT leaders and software developers must recognize this uncertainty and design solutions, architectures, and infrastructures that accommodate future changes.
A well-structured codebase should reflect the uncertainties of the business. It must be flexible enough to adapt to different directions as the company expands. Designing with adaptability in mind ensures that as business needs shift, the technology can keep up without requiring a complete overhaul.
Where to find the right mentorship?
For early startups, it is also vital to have people who can professionally guide them, at least at the initial stages of their development.
While many mentorship services are available online, they often lack a very important element: trust. It is difficult to assess a mentor's true experience, expertise, and quality of services without firsthand knowledge.
Instead of relying solely on external help from the internet, startup founders should first turn to their personal networks, including friends, former colleagues, business partners, and industry acquaintances.
These trusted connections can either offer direct guidance or introduce founders to experienced professionals within their networks.
Human connections are invaluable. In the startup world, relationships often open doors to mentorship, partnerships, and new opportunities that wouldn't be accessible otherwise. Entrepreneurs should prioritize building and maintaining strong professional relationships, as these connections often prove more beneficial than any formal mentorship services.
The journey of building a tech startup is filled with challenges, from finding the right idea to managing scalability. According to Laimonas Sutkus, flexibility and readiness to iterate are among the key components that can drive a tech startup to success.
Want to learn more about technologies and their role in the business world? Don't miss the next episodes of the Innovantage podcast, where its host Max will welcome new experts to his studio.
Business Strategy & Growth
5 observations about growing companies
March 16, 2025
9 min read

Learn the top strategies for growing companies, including customer value, market-based assets, commoditization, and hypercompetition. Insights from Maxim Golikov’s studies at Antwerp Management School.

By Maxim Golikov
CBDO @ Sigli | Digital Transformation, Product Development, Tech Business Consulting
Reflecting on studies at the Antwerp Management School
Some people say that it doesn't matter what you study; it's more important what you learn. But when you study really useful things and manage to apply them to your work, that's a perfect combo. That's exactly how I felt after finishing my studies at the Antwerp Management School. And that, in and of itself, could have been enough.
But in practice, some lessons remain a larger part of your day-to-day than others. Some are more applicable, more tangible, and more relatable. The more I thought about it, the more I wanted to write some of it down.
And now we're here.
I am not going to retell everything that I was lucky to learn from my professors. Instead, I want to share my own interpretation of what I heard. I didn't come up with these concepts and can't claim ownership of them. However, I do find them useful, and I hope you will as well.
There is quite a lot of information that I want to share. But in this first article, let me start with 5 observations on what growing companies do to achieve said growth. This is mostly based on a lecture by prof. Koen Vandenbempt.
Observation 1. They focus on superior customer value
It's absolutely obvious that the growth of some companies and the stagnation of others is not just a random occurrence. There is something in the nature of these companies that makes them either move forward or stay where they are.
But what is that "something"? Have you ever tried to detect it on your own, based on common real-life examples?
Non-growth companies are usually those that deliver mainly product offerings and focus only on the efficiency of their processes. As a result, they face low profitability and a lack of growth.
Companies that grow successfully, as a rule, offer services and invest their efforts into bringing value to their customers. Yes, they still take care of their efficiency and may choose different approaches to creating value. But everything they do is aimed at enhancing the benefits that their customers can leverage thanks to their offerings. And that's one of the key components (and secrets) of their growth.
It's worth noting that there can be two approaches to value creation. The first one is internally oriented. In this case, companies are focused mainly on increasing their efficiency.
Courtesy: prof. Koen Vandenbempt
The second approach is more about external orientation, value-based logic, and the implementation of new concepts and innovations.
Courtesy: prof. Koen Vandenbempt
In the end, you have to focus on either creating or capturing value to support continuous, sustainable growth.
Observation 2. Core growth is key
Traditionally, when we think about business growth, we start considering various "tactics" that can boost it, like expansion to international markets, price increases, aggressive accounting practices, mergers and acquisitions, etc. In other words, you try to get fruit from what you already have, without introducing any fundamental changes. Nevertheless, such efforts are not enough.
But what will work in this case? What should be taken as the main key to core growth? Market-based assets, such as customer relations, networks, partnerships, strategic market intelligence, etc. Growth companies work not only with physical resources but also with their intangible assets.
And here it's necessary not to forget about the importance of technology and design competencies.
That's all about the power of digitalization and delivering cost-efficient, customer-centric solutions that represent real value.
Courtesy: prof. Koen Vandenbempt
This means that growth cannot be achieved synthetically, at least not in the long run. Yes, you can optimize a lot in your business, but if the core is weak, it is only a matter of time until it cracks.
Observation 3. Commoditization happens, always
If you've never heard of commoditization, allow me to explain it in simple terms. This process can be defined as the transformation of services or products into a standardized, marketable object that provides customers with tangible value.
Very innovative products (or services) that haven't yet found their regular consumers can't be considered a commodity. However, when they gain clear use cases and people see what value such offers bring, that's already a completely different story. Very often such commodities become nearly indistinguishable from similar offers except for their price. And that's where businesses should think about enriching their products and services with unique value.
Let's take plane tickets as an example. Today they are a commodity. People know what they are, how they can use them, and what value they will get when they buy these tickets. However, it was not always so. A few decades ago there were very few flights, and tickets were absolutely inaccessible to the average person. But with time, the situation has changed. Today you have low-cost providers like Ryanair, premium jets that carry celebrities halfway across the world for a croissant, and small-scale private aviation with loads of options in between. Flying has become a commodity.
And probably, something similar may happen to space travel. Today "space tickets" are not a commodity. They are too expensive, too impractical, and their value for a wider audience can be best described as "controversial". But in the near future, who knows…
Can commoditization be a feature of growth companies? Definitely yes. Growth companies should be good at meeting customers' needs and creating offers that include segment-oriented value propositions.
You may say that commoditization (which is associated with a better market position for a company) leads to falling profitability. And yes, it can be an alarming sign. Nevertheless, it is not exactly so. Falling profits can become an excellent incentive for businesses to look for ways to create new value for their customers in order to win a bigger market share. And that's a continuous process that works like a perpetual motion machine.
The only way to fight it is to understand that it is inevitable, and the process is unending. To escape the squeeze, you must find new value for your company and clients.
Observation 4. Hypercompetition is a fact of life
Competition fuels innovation. But that's only one side of the coin. Competition also intensifies the fight for the customer. And today we can observe competition across many dimensions, including prices, branding, innovation, offer differentiation, and others. It exists in all markets, including niche ones.
Growth companies are always ahead of the game. What is required to achieve this? To attract your customers' attention, and to do it faster and better than your competitors, you need to deeply understand the needs and expectations of your target audience.
Getting customer insights, interpreting market trends faster than others, and being ready to break the rules by crossing industry boundaries and relying on unconventional segmentation: these are just a few of the things you should do.
In conditions of hypercompetition, companies need to innovate fast, look for new opportunities, and deliver clear value to customers. Where can we observe the effect of hypercompetition today? Actually, across many sectors, including the smartphone market (where Apple, Samsung, Huawei, and others are fighting for the attention of their target audiences); online retail, with such eCommerce giants as Amazon, Alibaba, and Walmart trying to win a bigger share; and video streaming, with Amazon Prime, Disney+, and Netflix.
But let's dive deeper into some real-life examples. Two opposite cases come to my mind.
In the 20th century, Kodak was practically a synonym for photography. The company's tagline "Kodak moment" even entered the common vocabulary and was used to describe personal events worth recording for the future. It was the unrivaled leader in the film photography market. But what happened next? The first pitfall it faced was increasing competition from Fujifilm in the late 1990s. However, the main obstacle it couldn't overcome was the transition from film to digital photography.
Despite its attempts to move to this technology, and even the presentation of its first self-contained digital camera, Kodak hesitated to push forward with innovation in this area. The company's main fear was that digital photography would cannibalize its film business. Nevertheless, this refusal to innovate became fatal for Kodak.
On the flip side, we can mention Apple. When it started its business journey, it wasn't leading the mobile phone market. However, the introduction of the iPhone in 2007 revolutionized mobile technology and significantly changed our understanding of what such devices could offer in terms of functionality and interfaces.
While other market players were focused on incremental improvements to their existing technologies, Apple's strategy was to redefine the entire user experience. Thanks to this bold approach, the company managed to win leadership not only in the smartphone market but also in the tech space at large.
So what can we see here? Due to its reluctance to innovate, Kodak had no other choice but to file for Chapter 11 bankruptcy protection in 2012, which further resulted in the company's reorganization. Meanwhile, as of May 2024, with a market capitalization of over $2.8 trillion, Apple is one of the largest companies in the world. It has one of the most recognizable brands, and it continues to evolve. I strongly believe that this example perfectly illustrates the power of innovation in achieving market dominance.
Regardless of your current market position, competitive advantage, technical superiority, or any other factor you may feel makes you immune, competition will squeeze you out if you refuse to innovate.
Observation 5. Customer retention is a must
If you are building your own business, you've probably noticed that to attract customers, you need to promise them something. But to retain them, you need not only to deliver on that promise but also to adapt it to their needs.
Customer retention is a rather complex process that requires continuous improvements and a deep understanding of people's desires, interests, preferences, and demands.
Growing companies usually focus on expanding their share of business with existing customers, and that's their power.
I am sure you've heard about the sales funnel, which is used to describe the journey that potential customers go through, from attracting clients to closing a sale. Nevertheless, growth companies shouldn't consider the sale itself as the final goal. It's much more important not just to sell as much as you can but to retain as many customers as possible. Otherwise, all your efforts won't make any sense in the long run.
Courtesy: prof. Koen Vandenbempt
As opposed to the traditional sales funnel, the customer retention funnel is inverted. You start with your customers already there, and you need to get them to the point where you want them to be. To do this, you need a deep understanding of customer needs, the introduction of integral service offerings, and, finally, co-development with existing clients. This can create unique solutions and even services that support your company's growth and competitive advantage.
For growth companies, it's not enough just to push a customer to make a purchase. They view these two funnels as two integral components of one unified process of working with customers. And the final goal of their work is to build long-term relationships with customers, built on trust and mutual benefit.
Conclusion
What was surprising to me is that continuous growth is not about being the best at what you do or having the largest market share. It's about capturing value in an environment that tries to kill your company, and, in a sense, you should let it. The secret is that by the time commoditization or competition squeezes you out, you should already have several more options for growth on the way.
And the best way to do it? Your customers. Create value for them on a permanent basis, be ready to change along the way, and you will be able to cultivate true innovation and count yourself among the ever-elusive group of growth companies.
The bad news is that there are already growing companies in every industry, and somehow you need to compete with them. They are helmed by smart people who have been doing everything I described for years already. What to do about this competition?..
That's a story for the next article. Stay tuned!
Contact Maxim:
maxim.golikov@sigli.com
https://www.linkedin.com/in/maxim-golikov/
Web Development
Serverless future: Why many businesses are saying goodbye to servers
March 4, 2025
10 min read

The podcast host and the CBDO at Sigli Max Golikov invited Michael Dowden to speak about the decreasing significance of servers today, as well as the benefits businesses can gain from this shift.

One of the key goals of the Innovantage podcast is to let its audience learn more about the latest tech trends and explore how they are transforming the business landscape. The recent episode serves exactly this purpose. This time, the topic of serverless technology took center stage.
The podcast host and CBDO at Sigli, Max Golikov, invited Michael Dowden to speak about the decreasing significance of servers today, as well as the benefits businesses can gain from this shift.
Michael is a technology leader, international speaker, and serverless expert with more than 30 years of experience in software development and consulting. Of course, serverless technology hasn't been around for that long; it started gaining adoption nearly 15 years ago. Given this, Michael has a very good understanding of how the shift to the new technology happened. Moreover, he explained why businesses started ditching their servers and in which cases using servers is still feasible.
All this and much more was discussed in the episode, and we've gathered the most interesting ideas for you in this article.
Serverless computing: What is it?
The core concept of serverless doesn't mean there are no servers. They are still there, managed by providers rather than by businesses directly. The technology represents an additional layer of abstraction. Everything started with servers, then moved to virtual machines, followed by Infrastructure-as-a-Service (IaaS) and Platform-as-a-Service (PaaS). Serverless takes it a step further by providing only the essential operating containers required to run your tasks.
One key aspect of serverless is Function-as-a-Service (FaaS). Similar to microservices, each function is managed and scaled independently. You can build, deploy, and scale individual units of code without worrying about the entire system (a minimal sketch of such a function appears just below). If traffic spikes, serverless platforms scale automatically by creating copies of the function to handle the load. Once the traffic decreases, the platform adjusts by scaling back.
With serverless, you are charged only for the actual computing time used, and you don't have to manage the infrastructure yourself. The platform scales according to your needs, making it efficient and cost-effective.
In other words, with serverless technology, you are essentially outsourcing a part of your infrastructure to make it lighter, leaner, more responsive, and better in every possible way.
Michael shared his experience of transitioning to serverless technology while working at startups. His goal was to find cost-effective solutions that would scale as the businesses grew. The first shift to serverless was not planned; it just happened naturally. Michael's approach to software had always been user-focused, meaning he starts by building the front end and UX to quickly prove concepts and gather feedback from users.
One of the startups he worked with began as a progressive web application. As the product evolved, the team realized they needed a backend, and serverless was the logical choice to support their growth.
Downsides of going serverless
Though the idea of serverless may sound highly appealing, businesses should also stay aware of the possible downsides.
They become obvious when you need to exercise control at a low system level in a very specific way. While you can adjust parameters like memory allocation on platforms like GCP, AWS, or Azure, you don't have full access to the underlying system. Additionally, you might lose visibility into the scaling process.
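As promised above, here is what an individual "unit of code" can look like in practice: a minimal HTTP-triggered function. This is only a sketch; it assumes an AWS Lambda-style runtime with the @types/aws-lambda type definitions, and the event shape and greeting logic are purely illustrative.

```typescript
// Minimal Lambda-style handler (illustrative; assumes an API Gateway HTTP trigger).
// The platform, not this code, decides how many instances run, when they
// cold-start, and when they are scaled back down to zero.
import type { APIGatewayProxyEventV2, APIGatewayProxyResultV2 } from "aws-lambda";

export const handler = async (
  event: APIGatewayProxyEventV2
): Promise<APIGatewayProxyResultV2> => {
  const name = event.queryStringParameters?.name ?? "world";
  // Business logic only: no server setup, no scaling configuration here.
  return {
    statusCode: 200,
    body: JSON.stringify({ message: `Hello, ${name}!` }),
  };
};
```

Notice that the file contains business logic only; how many copies of it run, and when they spin up or shut down, is decided entirely by the platform. That is both the convenience and, as discussed next, the loss of control.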
For example, it may be impossible for you to control how many instances of your function are running at any given time. This lack of control over scaling and thresholds can be a challenge.
Another issue is the cold start problem. When a function is triggered, it often needs to spin up, typically using something like a Docker container. This setup takes time (it can be a couple of seconds), which leads to a lag before the function can serve traffic. That lag might be noticeable to users.
Observability becomes crucial in serverless environments. Without direct access to the system, you need to rely on external tools to monitor your code and infrastructure. If an issue arises, it can be hard to pinpoint the cause without proper monitoring in place.
Serverless is great for small, independently managed functions. But it has limitations for long-running services (often capped at around 5 minutes) or applications requiring extremely low latency. While serverless offers massive scalability, the slight latency of ramping up can be a concern.
When is serverless a good option for you? The serverless-first approach
As with any technology, it's essential to carefully consider your use case before choosing serverless.
Michael admitted that he advocates a serverless-first approach, especially for new projects. He believes that most companies and projects should start with serverless by default. When you are unsure about the application or just starting out, serverless should be the initial choice. You should consider a different architecture only when you understand why serverless might not be suitable. This may be relevant not for the entire project but only for specific parts of it.
One key reason for the serverless-first approach is that this technology allows you to build faster. You spend less time setting up infrastructure, which enables you to get your project running quickly. Moreover, the technology scales automatically, so if your project experiences unexpected traffic spikes, it can handle them with minimal effort.
By starting with the serverless-first approach, you can focus on getting your product in front of users while managing traffic. This gives you the flexibility to learn what works and what doesn't, without worrying about infrastructure bottlenecks.
Michael also mentioned the following case. Amazon is well known for its use of serverless infrastructure in its video streaming service. The company once published an article explaining how it stopped using serverless for one component of that service. However, the article was widely misinterpreted, and because of the reaction it provoked, Amazon eventually retracted it. Despite the misunderstanding, the article was a great technical explanation of that decision.
The company explained that this specific part of the project had requirements that its serverless infrastructure couldn't meet. So, they changed the architecture of just that one component. This helped them save money, improve the user experience, and make the system more efficient.
According to Michael, this is a perfect example of the serverless-first approach. Amazon was able to build and run the service in front of customers for months or even years before realizing they needed a different solution. They had the time to learn, design a better approach, and successfully implement it. This, he believes, is a huge success story for serverless.
Value of serverless computing
One of the key economic advantages of serverless, especially for startups, is its cost-efficiency.
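As a rough illustration of how pay-per-use pricing scales with traffic, here is a back-of-the-envelope calculation. The rates and workload figures below are hypothetical placeholders, not any provider's actual pricing.

```typescript
// Hypothetical pay-per-use cost model for a single function.
const invocationsPerMonth = 2_000_000;
const avgDurationSeconds = 0.2;
const memoryGb = 0.25;

const pricePerGbSecond = 0.0000166;   // hypothetical rate
const pricePerMillionRequests = 0.2;  // hypothetical rate

const computeCost =
  invocationsPerMonth * avgDurationSeconds * memoryGb * pricePerGbSecond;
const requestCost = (invocationsPerMonth / 1_000_000) * pricePerMillionRequests;

console.log(`~$${(computeCost + requestCost).toFixed(2)} per month`);
// Roughly $2 per month at this load; close to $0 with no traffic at all.
```

The same workload at zero traffic costs roughly nothing, which is exactly the "expenses scale with revenue" property described next.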
Many startups invest heavily in infrastructure before acquiring a single customer. With serverless, infrastructure costs remain low until traffic increases, allowing expenses to scale with revenue. This model ensures that businesses only pay for what they use, aligning costs with growth.
Another major benefit is flexibility. Unlike monolithic architectures, where components are tightly integrated, serverless systems are highly modular and event-driven. Functions, databases, and services operate independently, making it easier to scale, reorganize, or optimize specific parts of the system as needed. If two functions need closer integration, they can be adjusted without restructuring the entire architecture.
Starting with a modular serverless approach allows businesses to adapt more easily over time. It's much simpler to merge independent services when necessary than to untangle tightly coupled components in a monolithic system. This adaptability makes serverless an ideal choice for companies looking to scale efficiently while maintaining agility.
The concept of "good" code
According to Michael, a key test of adaptability in software is whether a piece of code can be removed without disrupting the rest of the application. Whether you need to introduce a new implementation, update a feature, or improve functionality, in most cases code will need to be replaced. Flexibility is essential, and well-structured code should allow for seamless modifications.
In a modular architecture, where functions are deployed independently, removing or replacing a function is straightforward. If a service is no longer needed, it can be deleted without affecting other components. If an update is required, a new function can be introduced without major disruptions.
However, sometimes the issue lies not in the function itself but in the process behind it. To improve user experience, it is often beneficial to handle certain operations in the background while allowing users to continue to the next step. Even when latency is a concern, effective solutions can mask delays and ensure a smooth and responsive experience.
Is serverless technology only for startups?
While serverless computing is often associated with startups, it is also highly valuable for large enterprises (if not more so). Some of the biggest adopters of serverless technology are actually the companies that provide it: Google Cloud, Amazon Web Services (AWS), and Microsoft Azure. These tech giants didn't create serverless computing for startups alone; they use it extensively themselves.
One of the primary reasons serverless is so beneficial at scale is agility. It allows enterprises to deploy new features and services without being directly tied to complex infrastructure work. While these companies invest heavily in infrastructure and employ some of the best hardware specialists in the world, their service developers don't need to focus on hardware management. Serverless enables them to build and deploy applications faster and keep innovation cycles short and efficient.
Large enterprises also often need to deal with ongoing legacy system upgrades. Many companies operate in a continuous cycle of replacing outdated platforms with modern solutions. Serverless offers a strategic way to introduce modular, scalable services into these transitions. By gradually integrating serverless technology, businesses can break apart monolithic architectures, reduce infrastructure overhead, and create a more flexible and efficient system.
And all this is possible without a full-scale overhaul.
The future of serverless technology
A few years ago, serverless computing was surrounded by significant hype, and some people believed that serverless could be used for everything (which is not true). Now this hype seems to be over, as we have reached a plateau. Some developers may not be familiar with serverless at all. However, serverless architecture is not a passing trend; it is here to stay. The question is not whether serverless will remain relevant but how widely it will be adopted and in what form.
In the coming years, more companies are expected to adopt a serverless-first approach, where serverless is the default choice unless specific needs dictate otherwise. However, this shift will not happen across the board, as different businesses have varying infrastructure requirements.
It's difficult to make industry-wide predictions, as serverless is just one of many architectural choices available. Over time, the term "serverless" itself may fade, with the focus shifting toward broader cloud-native patterns rather than a distinct category.
Another key area of change could be the pricing model. Today, serverless operates primarily on a pay-as-you-use basis, which provides cost efficiency and scalability. However, companies may also see opportunities to reduce expenses by purchasing capacity in bulk, quite similar to traditional cloud computing models where reserving resources upfront can be more cost-effective. This shift could help businesses optimize spending while still leveraging the benefits of serverless technology.
Apart from this, Michael explained that serverless computing will continue evolving, and it's highly likely that we will see new patterns emerge to streamline its adoption. Right now, each company needs to invent its own approach. Established best practices and frameworks could guide serverless implementations and make the technology an even more viable option for businesses of all sizes.
Challenges and opportunities presented by emerging technologies
Speaking about the impact that new technologies may have on the world, Michael stated that the consequences of adopting some innovations can be quite worrying.
Machine learning (ML) has been around for decades, but large language models (LLMs) are still relatively new. While they have specific strengths and weaknesses, the current hype makes it difficult to fully assess their most effective use cases. As the technology matures, a clearer understanding will emerge of what LLMs are truly efficient at and where their limitations lie.
Beyond technical capabilities, a critical aspect of evaluating any architectural decision is its impact on the world. We should consider decisions not just in terms of performance and cost but also sustainability. Energy consumption, carbon footprint, and ecological impact are all essential considerations.
The ability to make informed decisions about minimizing negative effects and amplifying positive ones is crucial. However, when it comes to LLMs, this level of control is still lacking, which raises concerns about their long-term viability from an environmental perspective.
Serverless computing, on the other hand, offers a more sustainable approach to infrastructure. It inherently optimizes resource usage, scaling only when needed.
Running dedicated servers, by contrast, can lead to wasteful energy consumption, making serverless a more viable option for reducing carbon footprints.
How to choose the right infrastructure
At the end of the discussion, Max asked Michael to share his recommendations for businesses that need to choose the type of infrastructure for their systems.
According to the expert, your team's existing skill set should be a key factor. Of course, it's possible to push a team to learn, adapt, and grow. However, leveraging their current expertise can often be the most efficient approach. If your team has deep knowledge in a specific area, it might make sense to build around that strength rather than forcing a transition to an unfamiliar technology stack.
Michael mentioned that some companies make infrastructure decisions that require them to hire entirely new tech teams just to support the shift. While this can bring in fresh expertise, it also introduces risks, costs, and potential disruptions.
Another crucial consideration is risk mitigation. Relying on a single location for server management or cloud services can create vulnerabilities in the event of outages, security breaches, or physical disasters. A robust disaster recovery strategy is essential to ensure business continuity.
For high availability and resilience, companies should consider multi-cloud strategies. This could mean using:
Multiple cloud providers to reduce dependency on a single vendor;
Containers deployed across different cloud environments.
A multi-cloud approach can significantly enhance reliability and performance. Nevertheless, it requires investment in tools, expertise, and governance to manage the complexity effectively.
As you can see, the implementation of any technology (even the most promising one) can come with a range of pitfalls, and it is always better to be aware of them beforehand.
If you want to learn more about tech innovations that are expected to change the world, don't miss the next episodes of the Innovantage podcast!
AI Development
AI and digital transformation: Practical tips for powerful changes
February 3, 2025
10 min read

One of the goals of the Innovantage podcast is to help businesses better understand what is happening in the digital world and how they can leverage tech advancements to improve their processes and get benefits in the long run. This episode fully aligns with this goal, as it is dedicated to digital transformation at enterprises and the most efficient approaches to it.

To cover this topic, Max Golikov, the podcast host and CBDO at Sigli, invited Stijn Viaene to join the discussion.
Stijn is a professor and partner at Vlerick Business School and KU Leuven. One of the core subjects he works on is exploring the best ways to create business value with investments in technology. His career started 25 years ago with research on insurance fraud. Some years later, when the concept of digital transformation entered the game, Stijn focused on it. Now, for more than a decade, he has been working in this domain, which has given him great experience and an impressively deep understanding of the peculiarities of this process.
What is digital transformation?
Digital transformation has become a hot topic these days. However, in many companies there is still a lot of confusion and disagreement about what it actually means. For some of them, this process simply means implementing tech solutions in their operations. Nevertheless, just using technology, even the latest, doesn't make that usage transformational.
According to Stijn, digital transformation is a strategic way to respond to the threat of not surviving and the opportunity to thrive in a digital economy. The main idea behind the implementation of any tech solution should be changing an organization for the better.
What is the key difference between digital transformation projects and just tech projects?
Stijn explained that digital transformation projects are not only new solutions that you invest in. They are not just shiny things that may look exciting or trendy. They should always be based on a strategy and a clear vision of how you will reshape your enterprise in the process of realizing them.
Of course, today, when there are so many new products and tools, businesses may be confused by all this variety. However, implementing all of them at once won't bring any value. It's much more important not just to follow every trend but to create a well-thought-out plan for a long-term, sustainable transformation.
How to plan transformation in a quickly changing tech world?
It's quite obvious that the digital world is very dynamic. Given this, people who are preparing for digital transformation in their organizations have to start looking at things in the right way to find the best approach to such changes. Here, Stijn shared three tips.
Acceptance of reality. We are living in a VUCA (volatility, uncertainty, complexity, and ambiguity) world. We can't deny that there is a lot of turbulence these days, and it is impossible to hide or run away from it. There is no sense in waiting for someone to come up with a magical recipe and solve all the issues. If you manage to navigate that turbulence better than your competitors, you will win the game. That's why the first thing to do is to accept that the world is turbulent. At the same time, this situation provides a lot of opportunities to win the competition.
A systemic view of change. You also need to correctly understand the nature of the change you are getting yourself into. The nature of this change is systemic.
It means that digital transformation is not just a task for your technology department or marketing team. Digital transformation should cover all elements and levels of your organization.
The right mindset. One of the worst things people can do at the beginning of such a transformation journey is to look for an easy way out. There is no single toolkit for conducting a transformation. People should understand that there is a lot of work to be done and be ready for it.
Digital transformation is not something that you can do in a year or two. This process won't stop; it's not just a project with clear timeframes. There will be a continuous stream of digital opportunities and threats in the future and, as Stijn explained, there won't be any end to it.
Challenges and peculiarities of digital transformation for SMEs
Transformation journeys for large enterprises and for small and medium-sized businesses are quite different. This is explained by the peculiarities of such organizations, as well as the opportunities available to them.
The main difference is that smaller businesses are more restricted in resources. Given this, the need to focus on a few projects and initiatives is greater for them. They can't allocate much attention to many things at once, and this can become a difficulty for such companies.
But at the same time, they have more opportunities to work in partnership with other organizations. Smaller businesses are usually more open to joining forces with others than big corporations, which often perceive the necessity to collaborate with other market players as a weakness or as a chance to dominate.
In reality, for many businesses, strength lies exactly in alliances and partnerships in an ecosystem of equals.
Partnerships can (and should) be win-win, which means that both parties need not just to look for benefits but also to be willing to invest their efforts and resources in such common projects.
Balancing short-term wins with long-term strategy
One of the major challenges in digital transformation for everyone is overcoming the mindset that prioritizes short-term gains. Many organizations strive for immediate results, often at the expense of long-term value. This mindset is antithetical to successful digital transformation, because true success requires consistent investment in infrastructure, processes, and cultural change.
However, completely ignoring short-term wins is not the solution either. Human psychology demands instant gratification. Therefore, offering periodic quick wins can help maintain motivation and engagement. The key is to frame these short-term achievements as steps toward the larger goal, ensuring they align with the long-term strategy.
How to change the mindset at an organization
Stijn highlighted that changing an organization's mindset is one of the hardest challenges leaders face during digital transformation. The key is not to fight the existing culture but to work with it. Leaders must identify the gap between the current mindset and the ideal mindset, and then carefully plan strategies to bridge that gap. This involves choosing battles wisely, because some areas may not be worth changing at the very beginning, while others will require significant focus.
Good leaders should also realize that they cannot change everyone. It's essential to identify individuals who are resistant to change and find ways to either confine their influence or part ways if necessary.
It is much better to spend energy on those who are open to change, as well as on new hires who bring fresh perspectives and align with the desired mindset. Such people can act as agents of transformation.
Talent remains one of the most valuable resources for organizations undergoing digital transformation. The global competition for skilled workers, particularly in tech, is high. Organizations need a clear talent strategy to attract, retain, and develop the skills required for success in the digital age.
Short-term layoffs for cost-cutting might seem appealing, but they always come with risks. Firing thousands today in the hope of rehiring tomorrow is neither fair nor sustainable. It can damage the company's reputation and lead to a loss of critical talent to competitors.
AI: Yes or no for digital transformation?
According to Stijn, we are currently at a moment when a lot of people realize that certain assumptions about AI that they might have made in the past are no longer necessarily true.
One of these assumptions is related to how AI can be applied. Many of us used to think that people would always be responsible for everything that involves creativity and curiosity, while technology would perform routine and boring work.
However, since 2012–2013, when the world just started talking about digital transformation, this illusion has gradually disappeared. Now, with models like ChatGPT, significantly more is possible than we expected.
Today we can hear some of the brightest writers say that the best poems they have read in the last couple of years were written by an AI. This proves that AI actually can do things that are based on creativity.
Interestingly, in the context of discussing the potential of this technology, there are also a lot of conversations about the nature of our humanity, our role, and our real differences from AI. There are also questions related to the ethical principles underlying the use of this technology and the risks that companies and societies take on when introducing AI.
Stijn believes that before starting to use artificial intelligence, we need to find answers to a number of important questions.
How will we use AI? Will we simply use it to automate work, or will we try to prioritize the augmentation of humans in the work environment?
Actually, the second option doesn't exclude automation. However, it puts automation at the level of support for augmentation. Here, Stijn stressed that the business cases for AI augmentation will be totally different from the business cases for AI automation.
People like Elon Musk have a quite transformative vision of the future. They have already tried to build entire factories that were expected to run with zero workers. Such factories didn't work. But that's not because the people behind such projects didn't believe in them; it's just that the technology was not yet ready.
In the future, it will be vital for leadership and companies to show their real stance on the relationship between human workers and technology. That's an important point that will demonstrate their vision of the future not only for their businesses but also for society in general.
If a lot of companies follow Musk's principles, the distribution of wealth in the world will be incredibly uneven. Based on their vision, companies define which technologies they will invest in.
The link between these choices and the survival of companies is becoming very tight. Stijn explained that if he had to decide on such investments today, he would put money into transforming a company into a learning organization: one in which everyone strongly believes that learning is part of their job.

Top tips for digital transformation consultants

Today, many companies planning a digital transformation invite third-party consultants or firms to guide them through the journey. As Stijn quite often acts as such a coach and consultant, Max asked him to give practical recommendations to people hired for these tasks.

Tip 1. The main thing any company providing such services should do is prepare a really good answer to one question: "What makes you the best digital transformation partner for your customers?" Even more importantly, the answer shouldn't differ when the same question is asked of any employee at the company.

Tip 2. As a consultant, you need to have, demonstrate, and help develop certain mindsets. Consultants shouldn't say "yes" to everything a customer asks for. They should be ready to act as sparring partners and be critically constructive. They should walk away when they genuinely think something is not going to work while the customer still insists on their own vision. This can be very tough, because good money might be involved; nevertheless, it is exactly what helps build the right reputation.

Tip 3. There is also competition between consultants, so it is very important to stand out from the crowd of companies offering similar services. To do this, you need to establish close contact with customers and make them part of your tribe.

Recommendations for executives

Stijn understands the needs and challenges of all parties that can be involved in a digital transformation. In the discussion with Maxim, he also shared ideas that can help executives who are planning to start a digital transformation at their organizations.

Your benchmark for what is good should be outside the company. Never assume you already have all the people you need around the table. What is good often lies beyond the boundaries of your enterprise, which is why you need to create your strategy and set your goals based on what you see around the organization, not just inside it.

The right mindset matters not only for consultants but for executives as well. It is very important to make transformational change part of it.

One more vital task is to balance the following paradox. As a leader, on the one hand, you will be expected to come up with your own vision and be able to inspire people. On the other hand, you also need to be humble: able to listen to the outside world, listen to your people, and see the existing weaknesses in your organization in order to define how they can be transformed. You should find the balance between humility and vision.

Wrapping up

At first glance, it may seem that digital transformation is mainly about technology. However, as Stijn Viaene explained, successful digital transformation is not only about that. It's also about people, their mindsets, and their approach to change.
By focusing on vision, talent, and strategic balance, organizations can navigate the complexities of digital transformation and position themselves for sustainable success in the digital age.

If you want to hear more details from this conversation, we recommend listening to the full version of this episode. And to learn more about business and technology in the modern world, do not miss the next episodes of the Innovantage podcast hosted by Max Golikov.
Data Engineering
AI hype: Do you really need AI to solve all your problems?
January 21, 2025
9 min read

In recent years, AI has maintained its position as one of the most promising and widely discussed technologies. Interestingly, it attracts the attention not only of technical experts but also of people far removed from the world of technology. Why is this happening? What is driving the hype around AI? To discuss these and many other questions, Maxim Golikov, the host of the Innovantage podcast and the CBDO at Sigli, invited AI experts to his studio. The guests of this episode were William De Prêtre, Head of AI at AllKind Group, and Artem Pochechuev, Head of Data and AI at Sigli.

Both of them have been working with artificial intelligence for many years and have observed different stages of its development. For this episode, they agreed to share their view of what is happening with the technology now and what we can expect in the years to come. They also explained the key challenges that organizations may face when integrating AI into their solutions. These insights will be of great help to everyone who is considering implementing AI in their company now or in the future.

Education as the major step toward AI introduction

As both Artem and William have rich experience with AI projects at their companies, Maxim asked them about the most important preconditions for successful AI implementation. Their answers may surprise a large part of the podcast's audience: both experts said that the first thing to do before bringing AI to people is to educate them on what AI actually is. If you ask random people what they understand by artificial intelligence, they will say it is ChatGPT. In reality, AI and its use cases go far beyond that.

The problem is that many people who want to use AI today have very limited knowledge of the technology. As a result, they can't find the best application for it. And using AI simply because it is AI is the wrong approach.

AI itself has become very capable, but it is not necessary to apply it everywhere. Many problems can be solved without it. According to William, if you can solve something with just your high school statistics course, solve it that way and not with AI (a short sketch below illustrates the idea). This lets you save your AI resources for things that really require AI.

The term "AI" has become a powerful marketing tool. You can sell something perfectly well just by saying it has AI, even if it doesn't use the technology at all.

As Artem noted, the first thing in any decision about adopting something new should be awareness. To adopt something, to decide that you need it, to start planning for it, you need to be aware of it. That's why this education should be company-wide: not only potential users but also decision-makers should be educated on tech-related questions.

The second thing to focus on is the process of AI implementation itself. You can't implement the technology without tech-savvy people on board: people who understand AI and are ready to go deeper and deeper into AI topics. Many businesses prefer to have a reliable technology partner; others choose to grow their own engineers who can handle all the required AI-related tasks.
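To make William's "high school statistics" remark more concrete, here is a minimal sketch of that idea. It is not taken from the episode: the sample data and the two-standard-deviation threshold are purely illustrative assumptions, but they show how a basic statistical check can answer a question that people are often tempted to hand to an ML model.

```python
# A minimal sketch of solving a task with basic statistics instead of AI.
# The sample data and the two-sigma threshold are illustrative assumptions.
from statistics import mean, stdev

def flag_outliers(values, n_sigma=2.0):
    """Return the values lying more than n_sigma standard deviations from the mean."""
    mu, sigma = mean(values), stdev(values)
    return [v for v in values if abs(v - mu) > n_sigma * sigma]

daily_orders = [102, 98, 110, 95, 105, 99, 101, 350]  # the spike is the obvious anomaly
print(flag_outliers(daily_orders))  # -> [350]
```

If a check like this already answers the business question, the AI budget can be kept for problems that genuinely need it.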
Beyond education and engineering capacity, there should also be specialists who help the company define the right purposes and priorities for its AI projects.

How to introduce a new tech solution

When you bring something new to managers and want them to approve a new solution, you should be ready to show its full potential. It is vital to explain everything in simple terms so that everyone fully understands your ideas. Managers do not need to know the technical details, but they do need to know what value the new technology can bring.

It doesn't matter who brings a new idea to the table: tech experts, business people, or external partners. What matters is how people adopt it. How do they understand it? How quickly do they apply it to real work? How do they avoid potential risks? It doesn't matter where exactly the implementation process starts; what matters much more is how you continue.

William also mentioned that the success of a solution often depends on the contributions of different teams. His company builds innovative products for students with different needs, such as the Web2Speech extension that reads text content aloud. The success of such projects depends on the interplay between input from engineers, input from people in the education sphere, and input from management. This constant interchange is required to succeed in the AI market.

Main challenge

Speaking about edtech solutions, William highlighted one very important aspect that many people forget: such solutions deal with children's data. That's why privacy laws, the GDPR, the AI Act, and other related regulations become very important.

Intuitively, you may know that anonymizing your data is crucial. In practice, it greatly complicates many things.

However, you can't avoid taking care of data protection. It is really necessary, because your solution will work with tons of very sensitive data, and you can't let it leak: that would destroy your reputation. The more widespread AI becomes, the more attention companies need to pay to data security and privacy protection.

Unfortunately, this is something engineers tend to overlook: they are focused on making AI perform well and may forget how valuable some data is to the people it describes.

How to get ready for working with data

According to Artem, people often underestimate the significance of data in AI. Yet it plays the most important role: if there is no data, there is no AI. Without data, you can't train any AI/ML model, all the way up to large language models (LLMs). You can't train anything without data, which is why data comes first.

One of the most crucial steps required for AI adoption is a shift toward a data-centric approach. Unfortunately, that is exactly what companies often miss.

Of course, many people have heard about AI, but they perceive it as a kind of jack-in-the-box that jumps out and does everything you need. It doesn't work that way: AI has to be trained on data before it can do anything for you.

In this context, William mentioned one of his company's projects, known under the code name Bulbasaur. It is an AI tutor that can assist teachers. It can be fed with course materials, and it is these materials and their quality that determine how good the tool is.
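The following toy sketch illustrates this general principle: an assistant grounded in supplied materials can only answer from those materials and has to admit when it lacks them. It is not how Bulbasaur is actually built; the snippets and the simple word-overlap scoring are assumptions made purely for illustration.

```python
# A toy sketch of the "no relevant material, no answer" principle.
# The snippets and the word-overlap scoring are illustrative assumptions,
# not a description of how Bulbasaur works.

COURSE_MATERIALS = [
    "Photosynthesis converts light energy into chemical energy stored in glucose.",
    "Mitochondria are the organelles responsible for cellular respiration.",
]

def answer(question: str, min_overlap: int = 2) -> str:
    """Return the snippet that best matches the question, or admit there is no data."""
    q_words = set(question.lower().split())
    best = max(COURSE_MATERIALS, key=lambda s: len(q_words & set(s.lower().split())))
    if len(q_words & set(best.lower().split())) < min_overlap:
        return "I don't have enough course material to answer that."
    return best

print(answer("What does photosynthesis convert light energy into?"))  # answered from the materials
print(answer("Who won the World Cup in 1998?"))  # outside the materials, so the tutor declines
```

Whatever replaces these two snippets in a real system, it is the quality of that material, rather than the retrieval trick, that decides how useful the tutor feels.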
If the solution doesn't have enough data, it will not be able to answer your question. That doesn't say anything about the solution itself; it just shows that you haven't provided it with relevant data. Without sufficient data, it simply won't work.

This principle applies to any AI-related task, whether we are talking about prediction or clustering: all such tasks are performed on data. Even if you just want AI to reformat your presentation, you first need to feed it your thesis, abstracts, and other presentations.

How to define your AI needs correctly

Artem explained that he usually splits AI needs into two categories: an internal track and an external track. The internal track covers tools that help your employees perform their usual duties more efficiently and bring more benefits to the company. The external track covers the projects that you, as a company, sell to your customers; here, it is important to understand whether you can improve those projects with AI tools.

Standing at this crossroads as a decision-maker, you need to choose which way to go. It is vital to clearly identify the pains of potential users: this gives you an understanding of the exact tasks your solution should perform. At the same time, you also need to talk to engineers and gather their opinions on how those tasks can be solved technically.

Nevertheless, the introduction of innovations does not always go smoothly. William said that you may also face resistance to change, and not just because you are offering AI solutions. In his practice, there were similar cases with cloud services: when his company started moving solutions to the cloud, many customers were quite confused by the decision. Yet now people complain a lot about their non-cloud solutions. Given this, it is safe to assume that at some point somebody will be dissatisfied with tools that don't have AI.

You should also be ready for situations when it is not feasible to continue a project that seemed promising at the beginning, either because there are not enough resources for it or because it is not fully supported by your company. William advised not to throw such a project away but to put it aside. It may still be viable later, and you will be able to return to it.

Practical tips for AI implementation

At the end of the discussion, Maxim asked the experts to share their recommendations with those who want to start their journey with AI.

"Surround yourself with good people. Educate everybody. Find good partners," William said. He recommended exploring all available options. "The path to heaven is clear. Go ahead and build your ladder. So even if you're not in heaven yet, at least you can hear the angels singing," he added.

According to Artem, the best way is to grow professionals inside the company and build on their expertise. He explained that many people today are ready to train you to work with AI, but in reality they just want easy money. That is not what successful education looks like. You need a capable person who can go deep and share the knowledge; this is the most effective way to educate the people around you and across the company.

William also highlighted the importance of industry conferences and organizations that support tech companies. Sometimes they can provide funding or help you get in contact with the right people.

AI future: What is it?

It is interesting to see how people's opinions about AI change over time.
Initially, teachers voiced a lot of concerns about children using ChatGPT for their homework. Now some teachers in Belgium explain to high school students how various types of AI work and help them build small AI projects using off-the-shelf components. All this indicates that quite soon we will have a new generation for whom AI will simply be part of everyday life.

Of course, it is very hard to predict the future, especially in a field moving as fast as AI. Nevertheless, it is possible to make some general assumptions based on what is happening now.

For example, according to William, there will be far more autonomous systems and self-driving cars. But he doesn't think they will come from Tesla, as other car manufacturers are already far more advanced in autonomous technologies. Apart from this, there will be more autonomous drones used for military purposes, as well as AI personal assistant agent systems, in which small dedicated agents work together to solve bigger problems.

Nevertheless, William hopes that we won't see more AI-generated images in the future. In his view, an AI-generated Hollywood blockbuster would not be the best idea. He said that we should assign our boring tasks to AI, while the more creative, fun work should still be performed by people.

Artem added that we should perceive generative AI as a tool, nothing more and nothing less. As for predictions, he also said that it is rather pointless to make them. Right now there is a lot of talk about AI hallucinations, but three years ago we didn't even know what that could be. That's why, when somebody tries to invent a framework to protect us from the vicious AI of the future, it is mostly a waste of resources: the future may turn out to be different from what we expect now.

Wrapping up

Artificial intelligence is a promising and powerful technology that, with the right approach, can help us solve many different kinds of tasks. However, as the experts advised, we shouldn't use a microscope to hammer nails. There are plenty of things people are trying to solve with AI today that do not need any sophisticated approaches and can be solved much more easily.

One of the core requirements for successful AI adoption and implementation is comprehensive education of people on the basic questions related to this technology. It is vital to know what AI is and how it can be used to bring benefits to your organization.

The Innovantage podcast plays a similar role: it helps raise the audience's awareness of various business and tech topics, with a focus on AI and its capabilities. If you want to learn more, do not miss our next episodes!