Decoding the Consultant Life: Real stories from real consultants – Olli & Smartvatten

Olli Alm, Senior Software Engineer, about working for our customer Smartvatten

 

I am Perttu Pakkanen, Codento's talent acquisition lead, and my goal is to better articulate why consulting can be a great career choice.

When potential customers ponder whether they should use our services, they usually like to see some reference cases. Why wouldn’t our potential employees think the same?

So, I had a chat with our Senior Software Engineer, Olli Alm. Olli has been working with our customer Smartvatten on their water efficiency technology. Smartvatten is a leading expert in the field in Northern Europe, helping its customers save hundreds of millions of liters of water every year.

I asked Olli to sit down with me in our office’s meeting room on a Tuesday afternoon. The weather outside was a bit rainy and dim, but it didn’t slow us down.

Here we go:

 

What kind of solution have you built?

I’ve worked on building a system for leakage monitoring and water consumption measurement, utilizing GCP infrastructure.

The client’s system is a water metering solution with two main functionalities: measuring water consumption and sending alerts, for example in the case of leaks. What makes this particularly interesting is that it’s an IoT system, which means we’re also constantly dealing with hardware solutions and how they indirectly influence the larger system.

What I find fascinating is the challenge of keeping the whole system cohesive and building a unified architecture. In the IoT world, there’s an abundance of real-time data coming in from various countries and time zones, which raises unique challenges like efficiency, data accuracy, and synchronization.

 

The solution indeed seems like a unique one. What kind of tasks have you done in the project?

My work has been primarily focused on data-oriented back-end development, though I’ve also contributed to some front-end work.

On the social side, I interact daily with the client across multiple levels, including product owners, the CTO, and a smaller stakeholder team. The collaboration has been so close that I can confidently say there is strong mutual trust between us and the client.

I’ve also been heavily involved in architectural work, which has been both challenging and rewarding.

Why I’ve enjoyed this project: The relationship with the client is very good and straightforward. There’s no unjustified pressure, communication is easy, and mutual trust makes it a pleasant working environment. I feel like an important part of the project and can see the impact of my work.

 

Those are good qualities in a working environment. What has been the most interesting thing you have done working for the customer?

One highlight has been working with time-series data and developing robust data pipelines. I also enjoyed contributing to the overall system architecture to make it more stable and efficient. Both aspects required creative problem-solving and provided a lot of professional growth.

 

What brings enthusiasm to your workdays?

There’s a constant but manageable level of pressure, which keeps things exciting without being overwhelming. Every day brings new and unexpected challenges – there’s always something fresh to tackle, which ensures I never get bored.

My days are usually quite busy, with a mix of maintenance tasks and new development work. The diversity of tasks keeps things engaging. Occasionally, there are urgent issues that require immediate attention, but that’s part of the fun.

Over time, I’ve learned so much that I can now work far more efficiently. It’s rewarding to see how much I’ve grown professionally through this experience.

 

What has been the most difficult part?

In general, learning to deal with uncertainty has been a big takeaway for me. It’s something I’ve improved at significantly during my time as a consultant.

A reflection on consulting: Consulting is like a marathon. Building a strong trust relationship with a client over the years allows the work to flow more smoothly. But occasionally, you’re faced with tough problems that take time to solve. The longer you work with the same client, the more efficient and effective you become.

 

What have you learned?

Working on a small team means the range of tasks I’ve handled has been extremely broad – from building infrastructure to fixing UI bugs. This has helped me develop a wide skill set.

Over the past three years, I’ve worked with Google Cloud Platform every single day, which has allowed me to deepen my expertise significantly. I’ve also refined my skills in routines, documentation, code quality, best practices, and multitasking, often juggling multiple responsibilities simultaneously.

It’s hard to point to one specific thing I’ve learned because the experience has been so comprehensive. However, I’d highlight Clojure as an interesting element – it’s something I’ve enjoyed exploring and using in this project.

 

That’s a lot of learning! Any last words to wrap things up?

In this project, my tasks have been very self-guided, which is a typical aspect of working on a small team. I appreciate the responsibility that comes with the role and enjoy the freedom to make my own decisions. This autonomy enhances my sense of self-efficacy and allows me to express my capabilities fully.

Codento fosters a professional environment with a strong emphasis on trust and responsibility, allowing me to thrive and deliver quality results in my daily work.

 

(We then continued to talk about different stuff not related to the subject per se and ended up using almost an hour for this. This was a great talk and I also learned a lot.)

 

 

Being part of pioneering projects like this allows for both personal and professional development. I strongly feel that at Codento you can engage in work that is not only challenging but also highly impactful in many industries.

Read more about us from our career site and see if there are any suitable opportunities for you!

 

 

About the interviewee:

Olli Alm is a Senior Software Engineer at Codento. He has 20 years of experience in different software development and architecture positions as well as teaching and research.

About the interviewer:

Perttu Pakkanen is responsible for talent acquisition at Codento. Perttu wants to make sure that the employees enjoy themselves at Codento because it makes his job much easier.

Why do data professionals prefer Google Cloud?

And why should you care?

Author: Juhani Takkunen, Data Engineer, Codento

Your data engineers have the challenging job of staying one step ahead of data scientists, ensuring that data is available, trustworthy and up-to-date when needed – even if it’s not needed right now. This way, your organization’s data remains ready to be turned into actionable business value and insights, whether for ad-hoc reports or data scientists’ deep-dive investigations. 

Makes sense, right? So why isn’t everyone doing this already? The simple answer: costs. 

Data platform costs can be divided into infrastructure and engineering costs, both of which are quite predictable: larger data volumes require more storage and compute performance, and more data sources increase the need for data engineering. While storage and platform costs have generally come down, especially with serverless solutions, the data engineering effort and costs can still be significant. This unfortunately often leads to valuable data remaining untapped.

In this post, I will explore why Google Cloud, particularly its analytics database BigQuery, is a top choice for data engineers and how it can help organizations overcome common data challenges. I will show how technical tools affect design decisions and why data professionals prefer certain tools and design patterns over others.

The data engineer’s choice: Google Cloud

Key features of a good data platform are security, ease of development and maintenance, and low cost. These are some of the reasons why top professionals working with vast amounts of data, such as researchers worldwide, prefer Google Cloud. According to the Stack Overflow 2024 Developer Survey, some two out of three senior data engineers currently working with BigQuery or Google Cloud want to continue using these technologies, while fewer than 20% would like to switch to alternatives.

Despite these statistics, many organizations still choose data tools based on what their businesspeople are accustomed to and like to use. This frequently leads to the adoption of Microsoft platforms like Azure or Power BI. While these tools are familiar and powerful for business users, they may not align with the needs of data engineers, who want more flexibility and scalability. Just as businesspeople get to determine the best tools for their work, so can, and should, the data team. Selecting the right tool for the task is vital for success, even if it means embracing a multi-cloud environment.

Data storage: BigQuery

Google Cloud’s suite of services includes an incredibly scalable and cost-effective serverless analytics database called BigQuery. BigQuery offers highly scalable data storage that developers can access and modify using familiar languages such as SQL or Python, regardless of their earlier background. Not all serverless solutions come with such benefits; for example, Azure Synapse Serverless does not directly support modifying data with SQL DML statements (INSERT, UPDATE, DELETE).
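To make this concrete, the sketch below builds one such DML statement and shows, in a comment, how it would be submitted with the official Python client. The project, dataset, table, and column names are purely illustrative, not from any real customer system:

```python
# Illustrative sketch: BigQuery accepts ordinary SQL DML, so routine data
# fixes can be expressed as plain statements. All names below are made up.

def null_out_bad_readings(table: str) -> str:
    """Build an UPDATE that clears obviously invalid measurements."""
    return (
        f"UPDATE `{table}` "
        "SET reading = NULL "
        "WHERE reading < 0"
    )

sql = null_out_bad_readings("my-project.analytics.measurements")
print(sql)

# With the official client library this statement would be executed as:
#   from google.cloud import bigquery   # pip install google-cloud-bigquery
#   bigquery.Client().query(sql).result()
```

In a serverless warehouse without DML support, the same correction would require rewriting and reloading the affected data instead of a one-line statement.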

BigQuery offers benefits like high availability, unlimited storage, and scalability. Its pricing model, based on data processed rather than stored, makes it a cost-effective solution for large datasets. As storage and operations are billed separately, there is never a need to pause the service. BigQuery can also easily be connected to any modern BI system, such as Looker, Power BI, or Qlik.

Pricing model based on data processed, not data stored.

The most pressing challenge for many organizations is the vast amount of unstructured data, such as text, PDFs, and images, that remains untapped, as many data platforms, or data platform substitutes like Excel, struggle to make this data accessible for analytics.

BigQuery is optimized for machine learning (ML) tasks. Google Cloud’s acceptance among data and AI enthusiasts is evident, as 70% of generative AI startups rely on Google Cloud’s AI capabilities. This staggering number suggests that the people who bet their livelihoods on data and generative AI find Google Cloud’s offering and technology the most appealing. BigQuery ensures data accessibility across the organization with strict access controls, empowering employees while maintaining security. It also integrates seamlessly with other Google Cloud services, creating comprehensive data pipelines.

In early 2024, the Enterprise Strategy Group (ESG) compared the cost and features of four major cloud data warehouse solutions: BigQuery, AWS Redshift, Snowflake, and Azure Databricks SQL Serverless. They interviewed users and studied cases to build a realistic model of the three-year total cost of ownership (TCO) of these data warehouse solutions. They found that BigQuery could reduce TCO by up to 54%, offering easier operation, better flexibility, and built-in compatibility with other cloud services.

BigQuery eliminates the need to manage, monitor, and secure data warehouse infrastructure, allowing teams to focus on using insights instead of managing the process. Unlike other solutions, BigQuery is fully managed, meaning there are no physical or virtual servers to handle. It optimizes storage automatically and supports AI and machine learning work.

Data pipelines: Dataflow and Dataproc

Data pipelines often start from a simple task: load data from the source and store it in a database. One could imagine that such a repeatable, simple task could be solved with something simple like a low-/no-code solution. Unfortunately, in our experience, the data sources and scenarios are so varied that eventually every ETL (Extract-Transform-Load) tool requires at least some custom code, often related to authentication, data parsing, dynamic mapping, retry mechanisms, or error handling. As simple tasks grow into more complex business problems, the simplest development tools may start to restrict data engineers, and maintaining the resulting hacky solutions, in particular, can be a real challenge.
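As a concrete example of the kind of custom code that tends to creep in, here is a minimal retry helper with exponential backoff in Python. It is a generic sketch of the pattern, not code from any particular tool or customer project:

```python
import time
from typing import Callable, TypeVar

T = TypeVar("T")

def with_retries(fn: Callable[[], T], attempts: int = 4,
                 base_delay: float = 0.01) -> T:
    """Run fn, retrying with exponential backoff on any exception."""
    for attempt in range(attempts):
        try:
            return fn()
        except Exception:
            if attempt == attempts - 1:
                raise                              # out of attempts: surface the error
            time.sleep(base_delay * 2 ** attempt)  # 0.01s, 0.02s, 0.04s, ...
    raise RuntimeError("unreachable")

# Usage: a flaky "source" that succeeds on the third call.
calls = {"n": 0}

def flaky_fetch() -> str:
    calls["n"] += 1
    if calls["n"] < 3:
        raise ConnectionError("transient failure")
    return "payload"

print(with_retries(flaky_fetch))  # succeeds after two retries
```

Even this toy version raises design questions a no-code tool rarely exposes: which exceptions are retryable, how long to back off, and where failures should be logged.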

Based on our customer examples, data engineers typically prefer tooling that allows multiple developers to work simultaneously. Developers need to be able to run individual pipelines locally or in a sandbox environment, reuse code with functions, and deploy code using pull requests and version control. The last part often turns out to be the most challenging, since a successful pull-request review requires the reviewer to be able to both review and validate the change.

The main ETL tools in Google Cloud are Dataflow and Dataproc, both of which offer serverless ETL solutions. Dataflow and Dataproc are based on the Apache open-source projects Beam and Spark, respectively. With these tools, data engineers can write reusable and testable code in popular programming languages such as Python and Java.
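The extract-transform-load shape these pipelines implement can be sketched in plain Python. Note that this is an illustration of the pattern only; real Dataflow or Dataproc jobs would use the Beam or Spark APIs and run distributed, and the sample data is invented:

```python
# ETL pattern sketch: each stage is a reusable, individually testable
# function, which is exactly what data engineers want from their tooling.
from typing import Iterable

def extract() -> Iterable[str]:
    """Pretend source: raw CSV-ish rows (one is malformed on purpose)."""
    yield "2024-01-01,12.5"
    yield "2024-01-02,bad"
    yield "2024-01-03,7.25"

def transform(rows: Iterable[str]) -> Iterable[tuple[str, float]]:
    """Parse rows, dropping those that fail validation."""
    for row in rows:
        date, value = row.split(",")
        try:
            yield date, float(value)
        except ValueError:
            pass  # in a real pipeline this would go to a dead-letter sink

def load(records: Iterable[tuple[str, float]]) -> list[tuple[str, float]]:
    """Pretend sink: collect into a list instead of writing to a table."""
    return list(records)

result = load(transform(extract()))
print(result)  # the malformed row has been filtered out
```

Because each stage is a plain function, it can be unit-tested and code-reviewed in isolation, which supports the pull-request workflow described above.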

A lightweight, scalable data model – as a Service, if you will

BigQuery’s cost-effective pricing model and serverless nature make it an efficient and scalable tool that allows data engineers to focus on extracting insights rather than maintaining systems or managing costs. Codento, in turn, is the leading Nordic Google Cloud-focused software integrator. Our extensive Google Cloud data platform proficiency has proven that a lightweight data model built on serverless technologies like BigQuery and Python can effectively harness data from diverse sources.

Based on our earlier hands-on experience with customers like Nordic e-commerce leader BHG and electric car charging pioneer Plugit, Codento has now built an opinionated data model for our customers. Our new turnkey solution, the Lightweight Data Model, is scalable in terms of performance and cost, making it suitable for organizations of all sizes. The setup is pre-configured, making it ready to use with minimal configuration effort, typically within eight weeks of the customer’s decision to proceed.

This new Data Model solution can be implemented in your existing Google Cloud environment or in a new one, or it can be offered as Software as a Service. In the latter case, Codento manages the data platform for you in our environment. Such a turnkey solution allows you to concentrate on your business and, if you wish, to continue using your existing tools alongside the new data model.

Key takeaways:

  1. Google Cloud’s BigQuery offers scalable, serverless data storage for datasets of any size.
  2. According to surveys, data professionals prefer to work with Google Cloud and BigQuery. 
  3. Google Cloud services scale effortlessly with future requirements, such as data volume, machine learning tasks, automated testing and quality controls.

Juhani Takkunen | Codento

About the author:

Juhani Takkunen is an experienced data engineer and Python wizard. He likes building working solutions where data flows efficiently.

 

Stay tuned for more detailed information and examples of the use cases! If you need more information about specific scenarios or want to schedule a free workshop to explore the opportunities in your organization, feel free to reach out to us.

Boosting Contact Center Effectiveness with AI

Conversational AI happens at competitors’ contact centers while you’re busy making other plans

 

Author: Janne Flinck, Data & AI Lead

Working for Nordic organizations in various industries, I have gladly noted that front-runners are already deploying modern Artificial Intelligence tools to increase their contact center efficiency and customer satisfaction. In contrast, the majority are still looking for marginal improvements via tweaks in ticket handling or streamlining the edges of their onboarding processes.

“Life is what happens to you while you’re busy making other plans.” This familiar motto applies to many Nordic customer service and contact center decision-makers regarding conversational AI: It’s happening at competitors’ contact centers while you’re reading this blog.

 

Exceeding customer expectations while managing costs

Whether you’re a Nordic public sector entity or a private company running your business here, exceptional customer service is crucial. According to Salesforce, nearly 90% of customers today perceive the experience a company delivers to be as important as its actual products or services. Customer service leaders and marketing and sales officers face a common challenge: providing consistent, high-quality service while managing costs and resources effectively.

Nearly 90% of customers perceive the experience delivered as important as the products or services.

You want to ensure prompt, accurate responses to customers within acceptable wait times, regardless of the time of day. Simultaneously, you must balance the cost of contact center teams and onboarding new agents. You want to stay agile and be able to scale to meet the needs of growing organizations or seasonal peaks. Moreover, you want to gain insights into customer behavior and service performance to steer strategic decisions for optimizing operations and improving service quality. This is where Google’s Customer Engagement Suite comes into play.

 

Agents for agents

Generative and conversational AI agents are revolutionizing customer service, particularly in contact centers. Customer Engagement Suite is a collection of Google Cloud products designed to enhance contact center agent productivity, boost customer satisfaction, and reduce operational costs.

When your agent starts a call with a customer, Customer Engagement Suite provides live transcription, real-time answers to the customers’ questions, and a discussion summary. This helps the agent focus on customer interactions without worrying about taking notes. Customer Engagement Suite’s omnichannel support covers chat, SMS, VoIP, and video, ensuring seamless customer experiences across all channels.

Generative AI agents produce automated answers to customers’ questions by integrating with enterprise knowledge bases and other internal and external data sources. Customer Engagement Suite can also automate tasks like checking order status or updating payment details, ensuring customers always receive up-to-date information and services tailored to their needs. All this increases the efficiency of operations, and we have seen customers reduce call durations by up to 10%, yielding a significant payback on the system investment.

We have seen up to 10% reductions in call duration.

Quick access to relevant data for agents will also shorten the time needed for new employee onboarding. When newcomers have speedier access to the appropriate knowledge, the onboarding period can be up to 25% shorter, leading to a marked improvement in efficiency.

Many customer service calls involve tedious information-seeking, often for questions that repeat over time. Customer Engagement Suite’s virtual agent chatbots can relieve your agents of the repetitive burden by automatically finding answers to common questions using existing information sources and handling text, voice, and images in customer encounters. By reducing the need for human intervention in routine cases, the chatbots free human service agents to offer a more personal and richer interaction that increases customer and employee satisfaction.

Customer Engagement Suite offers powerful analytics tools that provide insights into customer interactions. These tools help your organization identify trends, improve processes, and make data-driven decisions.

As the Customer Engagement Suite is fully developed and managed by Google, it allows you to concentrate on extracting value for your operations. The deployments are efficient due to its seamless integration with telephony and contact center applications and tools for building custom features that adapt to your processes.

 

The Quantified Impact of AI in Contact Centers

In the bigger picture, AI will affect both new hires and existing employees in the coming years—in both negative and positive ways, depending on your position. In Metrigy’s AI for Business Success 2024-25 global research study of 697 companies, the following was discovered:

  • New hires – More than half of companies were able to reduce the number of new agents they needed to hire. The numbers are substantial: those who did not use AI in their contact center had to hire almost twice the number of agents during 2023 compared to those who did.
  • Existing employees – When contact centers were augmented with AI, nearly 40% of companies were able to reduce their headcount, with the average reduction being about one in every four employees.

For business leaders looking for technology to drive cost efficiencies, AI is doing its job. For example, with the addition of AI agent assist, the average handle time dropped by an average of 30%. At the same time, each supervisor saves nearly two hours per week when AI helps with scheduling and capacity planning. In addition to making agents and supervisors more efficient, AI-enabled self-service also helps automate customer interactions so that fewer of them even require live agent attention.

 

Real-world success stories in the making

I am honored to help several of our leading customers in the Nordics embrace the benefits of generative AI and conversational AI in their contact center operations. The most value can be extracted in organizations where the number of daily contacts is high and the onboarding cost is noticeable due to complex product structures. Such fields include retail, travel and leisure, banking, and insurance. Similarly, organizations with high peak demand, such as nonprofits with surging inquiries during a fundraising campaign or public offices with specific deadlines for citizens’ input, could benefit from Customer Engagement Suite. It helps diminish the burden on agents on duty, channels routine questions directly to virtual agents, and makes onboarding seasonal employees more straightforward.

As an experienced and award-winning Google Cloud solutions integrator, Codento offers comprehensive support to ensure a smooth transition to your contact center’s era of AI. The fact that Customer Engagement Suite is a complete solution developed and managed by Google will ensure a robust platform integrated with all your relevant data sources and a foreseeable future roadmap on which to build your contact center success.

Key takeaways:

  1. The experience delivered, e.g., by your contact center agents is as important for your business as the product or service you actually sell
  2. Google has packaged Artificial Intelligence tools for excellent customer service into a managed solution called Customer Engagement Suite
  3. The efficiency effect of AI in Contact Centers has already been quantified and, e.g., handling times have been seen to drop by 30%
  4. Codento is already working with Nordic organizations to harness AI for better customer experience and more efficient Contact Center operations

 

Codento | Janne Flinck

About the author:

Janne Flinck is an AI & Data Lead at Codento. Janne joined Codento from Accenture in 2022 with extensive experience in Google Cloud Platform, Data Science, and Data Engineering. His interests are in creating and architecting data-intensive applications and tooling. Janne has three professional certifications in Google Cloud and a Master’s Degree in Economics.

 

Living on the Edge – Google Kubernetes Engine makes edge computing finally real


 

Author: Markku Tuomala, CTO 

Edge computing has been an unkept promise of 5G networks for years. Industrial companies, energy and utilities, and transportation and logistics businesses have been longing for low-latency services that would allow them to monitor and react in real-time to happenings on the field. Telecom operators, in turn, have dreamt of a genuinely novel business case for their 5G network investments, in which they would offer a scalable, cost-effective edge computing solution as a service to their customers.

Google Cloud’s packaged tools enable Edge as a Service

Edge computing is the practice of processing data closer to the source rather than relying solely on centralized cloud data centers. It offers a range of practical benefits, such as reducing latency, enhancing real-time data processing, and improving system performance. The most mentioned use cases of edge computing are real-time monitoring and control of manufacturing processes, automation of production lines, fleet management, and employee safety.

Two concepts are essential to understanding the hurdles that have been blocking the widespread use of edge computing: containerization and Kubernetes. Containerization involves packaging an application and all its dependencies into a lightweight, portable unit called a container. This allows the application to run consistently across different devices, making it ideal for deployment on edge devices with limited computing capacity. Kubernetes, in turn, acts as a management system for these containers, orchestrating their deployment, scaling, and operation to ensure they run smoothly and efficiently. Jointly, containerization and Kubernetes enable efficient, scalable, and reliable edge computing by ensuring applications can be easily deployed and managed across numerous edge locations.
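To make containerization and Kubernetes a bit more tangible, here is a minimal, illustrative Deployment manifest; all names and the image reference are placeholders, and a real edge deployment with GKE Enterprise would involve considerably more configuration:

```yaml
# Illustrative Kubernetes Deployment: keep three replicas of a
# containerized monitoring service running. Names and image are made up.
apiVersion: apps/v1
kind: Deployment
metadata:
  name: line-monitor
spec:
  replicas: 3                  # Kubernetes maintains three running copies
  selector:
    matchLabels:
      app: line-monitor
  template:
    metadata:
      labels:
        app: line-monitor
    spec:
      containers:
        - name: line-monitor
          image: registry.example.com/line-monitor:1.0  # the packaged container
          resources:
            limits:
              cpu: "500m"      # sized for the limited capacity of an edge node
              memory: 256Mi
```

The declarative style is the point: you describe the desired state, and Kubernetes continuously reconciles each edge location toward it, restarting or rescheduling containers as needed.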

Managing containerized applications with Kubernetes is a complex technological endeavor that has been a showstopper for many interesting edge computing use cases until recently. In late 2023, however, Google launched a managed service called Google Kubernetes Engine (GKE) Enterprise that will revolutionize the opportunities to offer and deploy edge computing.

Google Kubernetes Engine Enterprise for multitenant edge computing

GKE Enterprise is a tool for managing multitenant edge environments, where you can cost-effectively and safely offer computing capacity at the edge to several users. These users can be the manufacturing sites of a single corporation in the same geographical area, or a group of clients of a telecom operator or a water or electricity company. By using GKE Enterprise, companies can efficiently manage workloads across cloud and edge environments, ensuring the seamless operation, safety, and high availability of applications that require extremely low latency.

Chicken and egg: are use cases awaiting the technology or vice versa?

Some have claimed that edge computing is a fad, as network connections with 30–60 ms latencies, especially in the Nordics, are supposedly enough for 90% of the use cases. The ambitious goal of edge computing is to diminish latency to less than ten milliseconds. This will enable use cases like those described above, which cannot be realized over current networks. From my experience, I am convinced that when the appropriately priced chicken is available, the application eggs will follow in numbers. In other words, when the cost of the mature platform technology is at the right level, the game-changing use cases and applications will follow.

Aiming at <10 millisecond latencies

We at Codento have talked with more than a hundred organizations about their plans and aspirations for using artificial intelligence. Customers have delightfully novel ideas for using video surveillance connected to AI, e.g., for identifying the crossing paths of an autonomous forklift and a maintenance worker. With real-time video and a predictive AI solution, a system could react to the upcoming incident faster than a human can, potentially saving the worker’s life.

Last week, we were thrilled to introduce our first customer case in this area to the world. Telecom operator Telia and Codento have collaborated to make edge computing available to Nordic organizations through Telia’s Sirius innovation platform, with ferry operator Finferries being the first customer to pilot the service.

Edge computing transforms industries by enabling secure, low-latency, real-time data processing. For Nordic telecom operators and industrial companies, Google Kubernetes Engine Enterprise offers a powerful platform to harness its benefits.

Codento’s expert team has extensive experience with industrial customers’ businesses and processes, an in-depth understanding of the AI-related use cases that Nordic companies are investigating, and award-winning capabilities in Google Cloud technologies. We are eager and prepared to help your organization fully utilize edge computing and its applications. Be it a solution you want to build for your own use or a platform you want to offer as a service to your customers, we are here to help.

Key takeaways:

  1. Edge computing will enable novel use cases like video monitoring and real-time reactions to events in, e.g., industrial processes
  2. Google Kubernetes Engine Enterprise is a solution enabling multitenant edge computing environments, adding scalability, cost, and security to “Edge as a Service”
  3. Codento can help industrial corporations or telecom, water or electricity companies to build use cases and services based on edge computing

 

About the author:

Markku Tuomala, CTO, joined Codento in 2021. Markku has 25 years of experience in software development and cloud from Elisa, the leading telecom operator in Finland. He was responsible for the cloudification strategy of telco and IT services and was a member of Elisa’s production management team. Key tasks included Elisa’s software strategy and the setup of operational services for business-critical IT outsourcing. Markku drove customer-oriented development and was instrumental in the business growth of Elisa Viihde, Kirja, Lompakko, Self Services, and network automation. He also led the transformation of Elisa’s data center operations to DevOps.

 

Breathe New Life into Cornerstone Systems

Take your Salesforce, SAP, Power BI, Oracle, AWS, and VMware solutions to the next level with Google Cloud

 

Author: Anthony Gyursanszky, CEO

We all want AI and analytics to boost our business and enable growth, but few of us have the deep pockets needed to redo our entire IT environment.

Most Nordic organizations have invested significantly in leading technologies like Salesforce, SAP, Microsoft Power BI, Oracle, AWS, and VMware. However, AI capabilities are scattered across a jungle of offerings, and a coherent AI roadmap is difficult to envision.

Integrating Google Cloud with the technologies mentioned above allows you to unlock new synergies and use advanced AI capabilities without extensive reconfiguration or additional capital expenditure.

 

Turbo boost your current system environment without overlapping investments

Adding Google Cloud to your IT strategy does not necessarily mean replacing existing systems. Instead, it can complement them, enabling them to work together more effectively and deliver greater value with minimal disruption.

For example, Google Kubernetes Engine (GKE) Enterprise enables seamless deployment and management of your existing applications across hybrid and multi-cloud environments. Your Salesforce, SAP, Oracle, and VMware systems can work together more efficiently, with Google Cloud as the glue between them. The result is a more streamlined, agile IT environment that enhances the capabilities of your current investments.

Google Cloud VMware Engine, in turn, allows you to extend your existing VMware environments to Google Cloud without costly migrations or re-architecting. This enables your business to tap into Google Cloud’s vast computing and storage resources, advanced AI tools like Vertex AI machine learning platform, and robust analytics platforms like BigQuery—without a revolution in your current infrastructure.

 

Harness all your data and deploy the market-leading AI tools

Data-driven decision-making is crucial today for maintaining a competitive edge in any field of business. Integrating Google Cloud with, e.g., your existing Microsoft Power BI deployment will significantly enhance your analytics capabilities. Google Cloud’s BigQuery offers a robust, serverless data warehouse that can process vast amounts of data in real-time, providing deeper and faster insights than traditional analytics tools. By connecting BigQuery to Power BI, you can easily analyze data from various sources like SAP, Oracle, or Salesforce and visualize it in dashboards familiar to your end users. Such integration enables your teams to quickly draw informed conclusions based on comprehensive, up-to-date data without significant additional investment.
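To make the BigQuery-to-BI pattern concrete, here is a minimal Python sketch of running the heavy aggregation in BigQuery before the results are surfaced in a tool like Power BI. The project, dataset, table, and column names are invented for illustration, and the client call assumes the `google-cloud-bigquery` library and valid credentials at runtime.

```python
def build_daily_sales_query(table: str) -> str:
    """Compose a simple daily-aggregation query for a fully qualified table.

    The column names (order_ts, amount) are hypothetical examples."""
    return (
        "SELECT DATE(order_ts) AS day, SUM(amount) AS revenue "
        f"FROM `{table}` GROUP BY day ORDER BY day"
    )

def fetch_daily_sales(table: str):
    """Run the query against BigQuery; requires google-cloud-bigquery
    and application-default credentials to be configured."""
    from google.cloud import bigquery  # imported lazily: optional dependency
    client = bigquery.Client()
    return [dict(row) for row in client.query(build_daily_sales_query(table))]

if __name__ == "__main__":
    # Print the generated SQL; actually fetching would need a real
    # BigQuery project, e.g. "my-project.sales.orders".
    print(build_daily_sales_query("my-project.sales.orders"))
```

Power BI would then read the same table or query results through its native BigQuery connector; the point is that the aggregation happens in BigQuery, not in the BI layer.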

Furthermore, Google Cloud’s Vertex AI can integrate into your existing data workflows. This way, you can take advantage of Google’s advanced machine learning and predictive analytics tools, and the analysis results can be visualized and acted upon within Power BI.

You can also activate your SAP data with Google Cloud AI for advanced analytics and for building cutting-edge AI/ML and generative AI applications. This enhances the value of your data and positions your business to respond more swiftly to market changes.

For businesses using Oracle, Google Cloud’s Cross-Cloud Interconnect provides secure, high-performance connectivity between Google Cloud and Oracle Cloud Infrastructure (OCI). This allows you to continue leveraging Oracle’s strengths while benefiting from Google Cloud’s advanced AI, analytics, and compute capabilities—without being tied to a single vendor.

 

Start small, and grow compliantly as you go

One key advantage of Google Cloud is that you can start benefiting from the advanced capabilities almost immediately, driving innovation and competitive advantage with only minor incremental investments. Google Cloud’s pay-as-you-go model and flexible pricing allow you to start small, scaling up only as needed and as you gain tangible proof of the business value. This approach minimizes upfront costs while providing access to cutting-edge technologies that can accelerate your business growth.

As your business’s cloud capabilities expand, maintaining data security and compliance remains a top priority, especially in the Nordic region, where regulations like GDPR are stringent. Google Cloud’s Hamina data center in Finland provides secure, EU-based infrastructure where your data stays within the region, meeting all local compliance requirements.

Google Cloud also offers advanced security features, such as Identity and Access Management (IAM), that integrate seamlessly with your existing systems like Microsoft Power BI and VMware. This ensures your data is protected across all platforms, allowing you to grow your cloud footprint securely and confidently.

 

Don’t put all your digital eggs in the same basket

Google Cloud’s open standards and commitment to interoperability ensure that you’re not locked into any single vendor, preserving your ability to adapt and evolve your IT strategy as needed. This strategic flexibility is crucial for businesses that want to maintain control over their IT destiny, avoiding the limitations and costs associated with vendor lock-in.

Google Cloud complements your existing IT investments and helps you gain a competitive edge from technology choices you have already made. At Codento, we specialize in helping Nordic businesses integrate Google Cloud into their IT strategies. We ensure that you can maximize the value of your current investments while positioning your business for future growth.

 

About the author:

Anthony Gyursanszky, CEO, joined Codento in late 2019 with more than 30 years of experience in the IT and software industry. Anthony has previously held management positions at F-Secure, SSH, Knowit / Endero, Microsoft Finland, Tellabs, Innofactor and Elisa. He has also served on the boards of software companies, including Arc Technology and Creanord. Anthony also works as a senior consultant for Value Mapping Services. His experience covers business management, product management, product development, software business, SaaS business, process management, and software development outsourcing. Anthony is also a certified Cloud Digital Leader.

 

Stay tuned for more detailed information and examples of the use cases! If you need more information about specific scenarios or want to schedule a free workshop to explore the opportunities in your organization, feel free to reach out to us.

Getting Your Company and Your Cloud AI-Ready: Ebook to Rearchitect Your Infrastructure to Unlock the Potential of AI


Our partner Google Cloud created a guide for technical leaders like yourself with a roadmap to build a future-proof foundation for AI innovation. With an infrastructure that can fuel the next generation of your business, new opportunities to operationalize AI will empower teams to generate solutions to legacy challenges.

In this eBook, you will discover:

  • The infrastructure considerations that can determine AI success or failure — examining cost, scalability, security, and performance dimensions
  • Actionable strategies to evaluate AI platforms, optimize resources, and maximize the value of your AI tools
  • How and when to consider adopting managed machine learning offerings like Vertex AI and flexible container environments like Google Kubernetes Engine (GKE) to ease the operational burdens of your team
  • Best practices for leveraging specialized virtual machines (VMs) optimized for AI, including those equipped with GPUs and TPUs.

Ready to tap into the power of generative AI?​​​​​​​

 

Submit your contact information to get the report:

The Executive’s Guide to Generative AI: Kickstart Your Generative AI Journey with a 10-Step Plan 


 

 

Not sure where to start with generative AI? See what your industry peers are doing and use Google Cloud’s 10-step, 30-day plan to hit the ground running with your first use case.

AI’s impact will be huge. Yet right now, only 15% of businesses and IT decision makers feel they have the expert knowledge needed in this fast-moving area. This comprehensive guide will not only bring you up to speed, but help you chart a clear path forward for adopting generative AI in your business. In it, you’ll find:

  • A quick primer on generative AI.
  • A 30-day step-by-step guide to getting started.
  • KPIs to measure generative AI’s impact.
  • Industry-specific use cases and customer stories from Deutsche Bank, TIME, and more.

Dive in today to discover how generative AI can help deliver new value in your business.

 

Submit your contact information to get the report:

Get Your Copy of Google Cloud 2024 Data and AI Trends Report


 

 

Your company is ready for generative AI. But is your data? In the AI-powered era, many organizations are scrambling to keep pace with the changes rippling across the entire data stack.

This new report from Google Cloud shares the findings from a recent survey of business and IT leaders about their goals and strategies for harnessing gen AI — and what it means for their data.

Get your copy to explore these five trends emerging from the survey:

  • Gen AI will speed the delivery of insights across organizations
  • The roles of data and AI will blur
  • Data governance weaknesses will be exposed
  • Operational data will unlock gen AI potential for enterprise apps
  • 2024 will be the year of rapid data platform modernization

 

 

 

Submit your contact information below to get the report:

Google Cloud Next’24 Top 10 Highlights of the First Day


 

Authors: Codento Consulting Team

 

Google Cloud Momentum Continues

The Google Cloud Next event, taking place this week in Las Vegas, showcases strong momentum in AI and Google Cloud innovation, with more than 30,000 participants.

Codento is actively participating in the event in Las Vegas through Ulf Sandlund and Markku Pulkkinen, and remotely via the entire Codento team. Earlier on Tuesday, Codento was named the Google Cloud Service Partner of the Year in Finland.

As the battle among the hyperscalers becomes fiercer, we can fairly observe that Google Cloud is well positioned going forward:

  • Rapid growth, with a $36 billion run rate outpacing its hyperscaler peers on a percentage basis
  • Continuous deep investments in AI and generative AI, with over a million models trained
  • 90% of unicorns use Google Cloud, showcasing a strong position with startups
  • A broad range of reference stories shared from many industries now using Google Cloud and its AI stack
  • Strong ecosystem momentum, both globally across all geographies and locally

 

Top 10 Announcements for Google Cloud Customers

Codento consultants followed every second of the first day and picked our favorite top 10 announcements based on the value to Google Cloud customers:

1. Gemini 1.5 Pro available in public preview on Vertex AI. Its context window has grown from 128,000 tokens to up to 1 million tokens. Google truly emphasizes its multi-modal capabilities. The battle against other hyperscalers in AI is becoming fiercer.

2. Gemini is being embedded across a broad range of Google Cloud services addressing a variety of use cases and becoming a true differentiator, for example:

  • New BigQuery integrations with Gemini models in Vertex AI support multimodal analytics, vector embeddings, and fine-tuning of LLMs from within BigQuery, applied to your enterprise data.
  • Gemini in Looker enables business users to chat with their enterprise data and generate visualizations and reports.

3. Gemini Code Assist is a direct competitor to GitHub’s Copilot Enterprise. Code Assist can also be fine-tuned based on a company’s internal code base which is essential to match Copilot.

4. Imagen 2. Google came out with its enhanced image-generating tool, embedded in the Vertex AI developer platform with more of an enterprise focus. Imagen 2 is now generally available.

5. Vertex AI Agent Builder to help companies build AI agents. This makes it possible for customers to build conversational agents quickly and easily, and to instruct and guide them the same way you do humans. To improve the quality and correctness of model answers, a process called grounding, based on Google Search, is used.

6. Gemini in Databases is a collection of AI-powered, developer-focused tools to create, monitor and migrate app databases.

7. Generative AI-powered security: a number of new products and features aimed at large companies, including Threat Intelligence, Chronicle (to assist with cybersecurity investigations), and Security Command Center.

8. Hardware announcements: Nvidia’s next-generation Blackwell platform is coming to Google Cloud in early 2025, and Google Cloud joins AWS and Azure in announcing its first custom-built Arm processor, dubbed Axion.

9. Run AI anywhere: a packaged generative AI search solution powered by Gemma, designed to help customers easily retrieve and analyze data at the edge or on-premises with GDC. This solution will be available in preview in Q2 2024.

10. Data sovereignty. Google is renewing its focus on data sovereignty with an emphasis on partnerships rather than building its own sovereign clouds.

There were also a lot of new announcements in the domains of employee productivity and Chrome, but we shall leave those areas for later discussion.

Conclusions

So far the list of announcements has been truly remarkable. As we anticipate the coming days of the Next event, we are eager to dig deeper into the details and understand what all this means in practice.

What is already known convinces us that Google Cloud and its AI approach are fully enterprise-ready, providing capabilities to support deployments from pilot to production.

To make all this real, capable partners like Codento are needed to assist along the entire journey: AI and data strategy, prioritized use cases, building the data foundation, implementing AI projects with strong grounding and integration, security and governance, and eventually MLOps practices to scale adoption.

For us partners, the much-anticipated news came in the form of a new specialization: the Generative AI specialization will be available in June 2024. Codento is ready for this challenge, with the practice and experience already in place.

The best place to follow the Google Cloud Next 2024 event and announcements is the Google Cloud blog.

 

Contact us for more information on our services:

 

Introduction to AI in Business Blog Series: Unveiling the Future


Author: Antti Pohjolainen, Codento

 

Foreword

In today’s dynamic business landscape, the integration of Artificial Intelligence (AI) has emerged as a transformative force, reshaping the way industries operate and paving the way for innovation. Companies of all sizes are implementing AI-based solutions.

AI is not just a technological leap; it’s a strategic asset, revolutionizing how businesses function, make decisions, and serve their customers.

In discussions and workshops with our customers, we have identified close to 250 different use cases across a wide range of industries.

 

Our AI in Business Blog Series

In addition to publishing our AI.cast on-demand video production, we summarize our key learnings and insights in the “AI in Business” blog series.

This blog series will delve into the multifaceted role AI plays in reshaping business operations, customer relations, and overall software intelligence. Each of the following posts takes a specific viewpoint, concentrating on a particular business need, with examples and customer references of innovative ways to implement AI.

In the next part – Customer Foresight – we’ll discuss how AI provides businesses with better customer understanding based on buying behavior, better use of various customer data, and analysis of customer feedback.

In part three – Smart Operations – we’ll look at examples of benefits customers have gained by implementing AI into their operations, including smart scheduling and supply chain optimization.

In part four – Software Intelligence – we’ll concentrate on using AI in software development.

Implementing AI to solve your business needs could provide better decision-making capabilities, increase operational efficiency, improve customer experiences, and help mitigate risks.

The potential of AI in business is vast, and these blog posts aim to illuminate the path toward leveraging AI for enhanced business growth, efficiency, and customer satisfaction. Join us in unlocking the true potential of AI in the business world.

Stay tuned for our next installment: “Customer Foresight” – Unveiling the Power of Predictive Analytics in Understanding Customer Behavior!

 

 

About the author: Antti “Apo” Pohjolainen, Vice President, Sales, joined Codento in 2020. Antti has led the Finnish sales organization of Innofactor (a Nordic Microsoft IT provider) and, prior to that, worked in leadership roles at Microsoft for the public sector in Finland and Central & Eastern Europe. Apo has been working in different sales roles longer than he can remember. He gets a “seller’s high” when meeting with customers and finding solutions that provide value for all parties involved. Apo received his MBA from the University of Northampton. His final business research study dealt with multi-cloud. Apo has frequently lectured about AI in business at the Haaga-Helia University of Applied Sciences.

 

 

Follow us and subscribe to our AI.cast to keep yourself up-to-date regarding the recent AI developments:

Google Cloud Nordic Summit 2023: Three Essential Technical Takeaways


Authors: Jari Timonen, Janne Flinck, Google Bard

Codento participated with a team of six in the Google Cloud Nordic Summit on 19-20 September 2023, where we had the opportunity to learn about the latest trends and developments in cloud computing.

In this blog post, we will share some of the key technical takeaways from the conference, from a developer’s perspective.

 

Enterprise-class Generative AI for Large-Scale Implementation

One of the most exciting topics at the conference was Generative AI (GenAI). GenAI is a type of artificial intelligence that can create new content, such as text, code, images, and music. GenAI is still in its early stages of development, but it has the potential to revolutionize many industries.

At the conference, Google Cloud announced that its GenAI toolset is ready for larger-scale implementations. This is a significant milestone, as it means that GenAI is no longer just a research project, but a technology that can be used to solve real-world problems.

One of the key differentiators of Google Cloud’s GenAI technologies is their focus on scalability and reliability. Google Cloud has a long track record of running large-scale AI workloads, and it is bringing this expertise to the GenAI space. This makes Google Cloud a good choice for companies that are looking to implement GenAI at scale.

 

Cloud Run Helps Developers to Focus on Writing Code

Another topic that was covered extensively at the conference was Cloud Run. Cloud Run is a serverless computing platform that allows developers to run their code without having to manage servers or infrastructure. Cloud Run is a simple and cost-effective way to deploy and manage web applications, microservices, and event-driven workloads.

One of the key benefits of Cloud Run is that it is easy to use. Developers can deploy their code to Cloud Run with a single command, and Google Cloud will manage the rest. This frees up developers to focus on writing code, rather than managing infrastructure.
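As an illustration of how little a Cloud Run service actually needs, here is a stdlib-only Python sketch of the kind of HTTP service you might deploy (for example with `gcloud run deploy --source .`, assuming the typical source-deploy workflow; this command and the service itself are our illustration, not from the conference). Cloud Run injects the `PORT` environment variable; everything else below is plain standard library.

```python
import os
from http.server import BaseHTTPRequestHandler, HTTPServer

def handle(path: str) -> str:
    """Pure request logic, kept separate from the server plumbing."""
    return f"Hello from Cloud Run, you requested {path}\n"

class Handler(BaseHTTPRequestHandler):
    def do_GET(self):
        body = handle(self.path).encode()
        self.send_response(200)
        self.send_header("Content-Type", "text/plain")
        self.end_headers()
        self.wfile.write(body)

if __name__ == "__main__":
    # Cloud Run sets PORT; default to 8080 for local runs.
    port = int(os.environ.get("PORT", "8080"))
    HTTPServer(("", port), Handler).serve_forever()
```

Everything about scaling, TLS, and request routing is handled by the platform, which is exactly the "focus on writing code" point above.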

Google just released Direct VPC egress functionality for Cloud Run. It lowers latency and increases throughput for connections to your VPC network, and it is more cost-effective than serverless VPC connectors, which used to be the only way to connect your VPC to Cloud Run.

Another benefit of Cloud Run is that it is cost-effective. Developers only pay for the resources their code consumes, and there are no upfront costs or long-term commitments. This makes Cloud Run a good choice for companies of all sizes.

 

Site Reliability Engineering (SRE) Increases Customer Satisfaction

Site Reliability Engineering (SRE) is a discipline that combines software engineering and systems engineering to ensure the reliability and performance of software systems. SRE is becoming increasingly important as companies rely more and more on cloud-based applications.

At the conference, Google Cloud emphasized the importance of SRE for current and future software teams and companies. 

One of the key benefits of SRE is that it can help companies improve the reliability and performance of their software systems. This can lead to reduced downtime, improved customer satisfaction, and increased revenue.

Another benefit of SRE is that it can help companies reduce the cost of operating their software systems. SRE teams can help companies identify and eliminate waste, and they can also help companies optimize their infrastructure.

 

Conclusions

The Google Cloud Nordic Summit was a great opportunity to learn about the latest trends and developments in cloud computing. We were particularly impressed with Google Cloud’s GenAI toolset and Cloud Run platform. We believe that these technologies have the potential to revolutionize the way that software is developed and deployed.

We were also delighted that Codento was awarded the Partner Impact 2023 Recognition in Finland by the Google Cloud Nordic team. Codento received praise for its deep expertise in Google Cloud services, its market impact, an impressive NPS score, and the achievement of its second Google Cloud specialization.

 

 

 

 

 

About the Authors

Jari Timonen is an experienced software professional with more than 20 years of experience in the IT field. Jari’s passion is to build bridges between business and technical teams, as he did in his previous position at Cargotec, for example. At Codento, he is in his element piloting customers towards future-compatible cloud and hybrid cloud environments.

Janne Flinck is an AI & Data Lead at Codento. Janne joined Codento from Accenture in 2022 with extensive experience in Google Cloud Platform, data science, and data engineering. His interests lie in creating and architecting data-intensive applications and tooling. Janne has three professional certifications and one associate certification in Google Cloud, as well as a Master’s degree in Economics.

Bard is a conversational generative artificial intelligence chatbot developed by Google, based initially on the LaMDA family of large language models (LLMs) and later the PaLM LLM. It was developed as a direct response to the rise of OpenAI’s ChatGPT, and was released in a limited capacity in March 2023 to lukewarm responses, before expanding to other countries in May.

 

Contact us for more information about our Google Cloud capabilities:

100 Customer Conversations Shaped Our New AI and Apps Service Offering 


 

Author: Anthony Gyursanszky, CEO, Codento

 

Foreword

A few months back, at a manufacturing industry event, Codento had just finished our keynote together with Google, and our people started mingling with the audience. Our goal was to agree on follow-up discussions about how to utilize Artificial Intelligence (AI) and modern applications in the attendees’ businesses.

The outcome of that mingling session was staggering: 50% of the people we talked with wanted to continue the dialogue with us after the event. The hit rate was not 10%, not 15%, but 50%.

We already knew that AI would change everything, but with this, our confidence climbed to another level. Not because we believed it ourselves, but because we realized that so many others did, too.

AI will change the way we serve customers and manufacture things, the way we diagnose and treat illnesses, the way we travel and commute, and the way we learn. AI is everywhere, and not surprisingly, it is also the most common topic that gets executives excited and interested in talking. 

AI does not solve these use cases without application innovations. Applications integrate the algorithms into an existing operating environment, provide the required user interfaces, and handle the orchestration in more complex setups.

 

We address your industry- and role-specific needs with AI and application innovations 

We at Codento have been working with AI and Apps for several years now. Some years back, we also sharpened our strategy to be the partner of choice in Finland for Google Cloud Platform-based solutions in the AI and application innovation space.

During the past six months, we have been on a mission to workshop with as many organizations as possible about their needs and aspirations for AI and Apps. This mission has led us to more than a hundred discussions with dozens and dozens of people from the manufacturing industry to retail and healthcare to public services.

Based on these dialogues, we concluded that it is time for Codento to move from generic technology talks to more specific messages that speak the language of our customers. 

Thus, we are thrilled to introduce our new service portfolio, shaped by those extensive conversations with various organizations’ business, operations, development, and technology experts.

Tailored precisely to your industry- and role-specific requirements, our new portfolio promises you more transparent customer foresight, smarter operations, and increased software intelligence – all built on a future-proof, next-generation foundation on Google Cloud.

These four solution areas will form the pillars of Codento’s future business. Here we go.

 

AI and Apps for Customer Foresight

As we engaged with sales, marketing, and customer service officers, we learned that most are stuck with limited customer understanding and little visibility into the impact their decisions and actions have on the bottom line. AI and Apps can change all this.

For example, with almost three out of four online shoppers expecting brands to understand their unique needs, the time of flying blind on marketing, sales, and customer service is over.

Codento’s Customer Foresight offering is your key to thriving in tomorrow’s markets.  

  • Use data and Google’s innovative tech, trained on the world’s largest public datasets, to find the right opportunities, spot customers’ needs, discover new markets, and boost sales with more intelligent marketing.
  • Exceed your customers’ expectations by elevating your retention game with great experiences based on new technology. Keep customers returning by foreseeing their desires and giving them what they want when and how they want it – even before they realize their needs themselves. 
  • Optimize your profits with precise, data-driven decisions by discovering your customers’ value with Google’s ready-made templates for calculating Customer Lifetime Value. With that, you can focus on the best customers, make products that sell, and set prices that work.
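As a toy illustration of the arithmetic behind Customer Lifetime Value: the sketch below uses the common simplified CLV formula, not Google’s actual template, and the numbers are invented.

```python
def customer_lifetime_value(avg_purchase: float,
                            purchases_per_year: float,
                            retention_years: float,
                            margin: float) -> float:
    """Simplified CLV: average purchase value x frequency x lifespan x margin."""
    return avg_purchase * purchases_per_year * retention_years * margin

# A customer spending 50 EUR per order, 6 orders a year, retained for
# 4 years, at a 25% gross margin:
print(customer_lifetime_value(50.0, 6.0, 4.0, 0.25))  # 300.0
```

Knowing which customers clear a CLV threshold is what then drives the focus, product, and pricing decisions described above.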

 

AI and Apps for Smart Operations 

BCG has stated that 89% of industrial companies plan to implement AI in their production networks. In our discussions with operations, logistics, and supply chain directors, we have seen this to be true – the appetite is there.

Our renewed Smart Operations offering is your path to operational excellence and increased resilience. You should not leave this potential untapped in your organization. 

  • By smart scheduling your operations, we will help streamline your factory, logistics, projects, and supply chain operations. With the help of Google’s extensive AI tools for manufacturing and logistics operations, you can deliver on time, within budget, and with superior efficiency. 
  • Minimize risks related to disruptions, protect your reputation, and save resources, thereby boosting employee and customer satisfaction while cutting costs.  
  • Stay one step ahead with the power of AI, transparent data, and analytics. Smart Operations keeps you in the know, enabling you to foresee and tackle disruptions before they even happen. 

 

AI and Apps for Software Intelligence 

For the product development executives of software companies, Codento offers tools and resources for unleashing innovation. The time to start benefiting from AI in software development is now. 

Gartner predicts that 15% of new applications will be automatically generated by AI in the year 2027 – that is, without any interaction with a human. As a whopping 70% of the world’s generative AI startups already rely on Google Cloud’s AI capabilities, we want to help your development organization do the same. 

  • Codento’s support for building an AI-driven software strategy will help you confidently chart your journey. You can rely on Google’s strong product vision and our expertise in harnessing the platform’s AI potential. 
  • Supercharge your software development and accelerate your market entry with cutting-edge AI-powered development tools. With Codento’s experts, your teams can embrace state-of-the-art DevOps capabilities and Google’s cloud-native application architecture. 
  • When your resources fall short, you can scale efficiently by complementing your development capacity with our AI and app experts. Whether it’s Minimum Viable Products, rapid scaling, or continuous operations, we’ve got your back. 

 

Nextgen Foundation to enable AI and Apps

While business teams are moving ahead with AI and App initiatives related to Customer Foresight, Smart Operations, and Software Intelligence, IT functions are often bound to legacy IT and data architectures and application portfolios. This creates pressure for IT departments to keep up with the pace.

All of the above comes down to having the proper foundation to build on, i.e., preparing your business for the innovations that AI and application technologies can bring. Moving to a modern cloud platform will allow you to harness the potential of AI and modern applications, but it is also a cost-cutting endeavor. BCG has studied companies that are digital forerunners and concluded that they can save up to 30% of their IT costs by moving applications and infrastructure to the cloud.

  • Future-proof your architecture and operations with Google’s secure, compliant, and cost-efficient cloud platform that will scale to whatever comes next. Whether you choose a single cloud strategy or embrace multi-cloud environments, Codento has got you covered. 
  • You can unleash the power and amplify the value of your data through real-time availability, sustainable management, and AI readiness. With Machine Learning Ops (MLOps), we streamline your organization’s scaling of AI usage. 
  • We can also help modernize your dated application portfolio with cloud-native applications designed for scale, elasticity, resiliency, and flexibility. 

 

Sharpened messages power Codento’s entry into the Nordic market

With these four solution areas, we aim to discover the solutions to your business challenges quickly and efficiently. We break the barriers between business and technology with offerings that speak the language of the target audience. We are dedicated to consistently delivering solutions that meet your needs, and to learning and becoming even more efficient over time.

Simultaneously, we eagerly plan to launch Codento’s services and solutions to the Nordic market. Our goal is to guarantee that our customers across the Nordics can seize the endless benefits of Google’s cutting-edge AI and application technologies without missing a beat.

About the author:

Anthony Gyursanszky, CEO, joined Codento in late 2019 with more than 30 years of experience in the IT and software industry. Anthony has previously held management positions at F-Secure, SSH, Knowit / Endero, Microsoft Finland, Tellabs, Innofactor and Elisa. He has also served on the boards of software companies, including Arc Technology and Creanord. Anthony also works as a senior consultant for Value Mapping Services. His experience covers business management, product management, product development, software business, SaaS business, process management, and software development outsourcing. Anthony is also a certified Cloud Digital Leader.

 

Contact us for more information on our services:

 

Leading through Digital Turmoil

Leading through Digital Turmoil

Author: Anthony Gyursanszky, CEO, Codento

 

Foreword

A few decades back, during my early university years, I became familiar with Pascal coding and Michael Porter’s competitive strategy. “Select telecommunication courses next – it is the future”, I was told. So I did, and the telecommunications disruption indeed accelerated my first career years.

The telecom disruption laid the foundation for an even greater change we are now facing, enabled by cloud capabilities, data technologies, artificial intelligence, and modern software. We see companies not only selecting between Porter’s lowest cost, differentiation, or focus strategies; with the help of digital disruption, the leaders utilize them all simultaneously.

Here at Codento we are on a mission to help organizations succeed through digital turmoil: understand their current capabilities, envision their future business and technical environment, craft the most rational transformation steps towards digital leadership, and support them throughout this process with advice and capability acceleration. In this work, we cooperate closely with leading cloud technology enablers, like Google Cloud.

In this article, I will walk through the journey towards digital leadership, based on our experiences and the available global studies.

 

What do we mean by digital transformation now?

Blair Franklin, Contributing Writer at Google Cloud, recently published a blog post, “Why the meaning of ‘digital transformation’ is evolving”. Google interviewed more than 2,100 global tech and business leaders around the question: “What does digital transformation mean to you?”

Five years ago the dominant view was to “lift-and-shift” your IT infrastructure to the public cloud. Most organizations have now done this, mostly seeking cost savings, but very little transformative business value has been visible to their own customers.

Today, the meaning of “digital transformation” has expanded, according to the Google Cloud survey: 72% consider it much more than “lift-and-shift”. The survey identifies two new attributes:

  1. Optimizing processes and becoming more operationally agile (47%). This, in my opinion, provides a foundation for both cost and differentiation strategies.
  2. Improving customer experience through technology (40%). This, in my opinion, boosts both focus and differentiation strategies.

In conclusion, we have now moved from the “lift-and-shift” era to the “digital leader” era.

 

Why would one consider becoming a digital leader?

Boston Consulting Group and Google Cloud explored the benefits of putting effort into becoming “a digital leader” in the Keys of Scaling Digital Value 2022 study. According to the study, about 30% of organizations qualified as digital leaders.

What is truly interesting is that digital leaders tend to outperform their peers: they bring 2x more solutions to scale, and with scaling they deliver significantly better financial results (3x higher returns on investments, 15–20% faster revenue growth, and cost savings of a similar magnitude).

The study points out several characteristics of a digital leader, but the one with the highest correlation relates to how they utilize software in the cloud: digital leaders deploy cloud-native solutions (64% vs. 3% of laggards) with modern modular architecture (94% vs. 21% of laggards).

Cloud native is a concept of building and running applications that takes advantage of the distributed computing offered by the cloud. Cloud-native applications are designed to utilize the scale, elasticity, resiliency, and flexibility of the cloud.

The opposite of this is legacy applications, which were designed for on-premises environments and are bound to certain technologies, integrations, and even specific operating system and database versions.
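To make the distinction concrete, here is a minimal sketch of one cloud-native trait (our illustration, not from the cited studies; the names and defaults are our own): a stateless service reads all of its configuration from the environment, so the same artifact runs unchanged on a laptop or in any cloud.

```python
import os

def load_config(env=None):
    """Twelve-factor style: configuration comes from the environment,
    with safe local defaults, instead of being baked into the artifact."""
    env = os.environ if env is None else env
    return {
        "db_url": env.get("DB_URL", "sqlite:///:memory:"),
        "port": int(env.get("PORT", "8080")),
        "log_level": env.get("LOG_LEVEL", "INFO"),
    }

# The same code runs locally and in the cloud; only the environment differs.
local = load_config(env={})
cloud = load_config(env={"DB_URL": "postgres://prod-db/app", "PORT": "443"})
assert local["port"] == 8080
assert cloud["db_url"].startswith("postgres://")
```

A legacy application would instead hard-code the database host or read a server-specific file, which is exactly the kind of binding to a particular environment described above.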

 

How to become a digital leader?

First, it is obvious that the journey towards digital leadership requires strong vision, determination, and investment, as there are two essential reasons why progress might stall:

  • According to a McKinsey survey, a lack of strategic clarity causes transformations to lose momentum or stall at the pilot stage.
  • Boston Consulting Group research found that only 40% of all companies manage to create an integrated transformation strategy. 

Second, the Boston Consulting Group and Google Cloud “Keys of Scaling Digital Value 2022” study further pinpoints a more novel approach to digital leadership as a prerequisite for success. The study shows that digital leaders:

  • Are organized around product-led platform teams (83% of leaders vs. 25% of laggards)
  • Staff cross-functional lighthouse teams (88% of leaders vs. 23% of laggards)
  • Establish a digital “control tower” (59% of leaders vs. 4% of laggards)

Third, as we have also observed here at Codento, most companies structured their organizations, roles, and processes into silos during the initial IT era, when they first started to automate manual processes with IT technologies and applications. They added IT organizations next to their existing functions while keeping business and R&D functions separate.

All three of these key functions have had their own, mostly independent views of data, applications, and cloud adoption. But because the cloud both enables and requires seamless utilization of these capabilities “as one”, companies need to rethink how they organize themselves in a cloud-native way.

Without legacy investments this would obviously be a much easier process, as “digital native” organizations like Spotify have showcased. Digital natives tend to design their operations “free of silos” around cloud-native application development, utilizing advanced cloud capabilities such as unified data storage, processing, and artificial intelligence.

Digital native organizations are flatter and nimbler, and roles are more flexible with broader accountability, as suggested by the DevOps and Site Reliability Engineering models. Quite remarkable results follow successful adoption: DORA’s 2021 Accelerate: State of DevOps Report reveals that peak performers in this area are 1.8 times more likely to report better business outcomes.

 

Yes, I want to jump on the digital leader train. How do I get started?

In summary, digital leaders are more successful than their peers, and it is difficult to argue against joining that movement.

Digital leaders do not consider digital transformation merely an infrastructure cloudification initiative; they seek competitive edge by optimizing processes and improving customer experience. Becoming a digital leader requires a clear vision, support from top management, and new structures enabled by cloud-native applications, accelerated by integrated data and artificial intelligence.

We here at Codento specialize in enabling our customers to become digital leaders with a three-phase value discovery approach that crystallizes your:

  1. Why? Assess where you are at the moment and what is needed to flourish in the future business environment.
  2. What? Choose your strategic elements and target capabilities in order to succeed.
  3. How? Build and implement your transformation and execution journeys based on previous phases.

We help our clients not only throughout the entire thinking and implementation process, but also with specific improvement initiatives as needed.

For a more practical perspective, you may want to visit our live digital leader showcase library:

You can also subscribe to our newsletters, join our upcoming online events, and watch our event recordings.

 

About the author: Anthony Gyursanszky, CEO, joined Codento in late 2019 with more than 30 years of experience in the IT and software industry. Anthony has previously held management positions at F-Secure, SSH, Knowit / Endero, Microsoft Finland, Tellabs, Innofactor and Elisa. Gyursanszky has also served on the boards of software companies, including Arc Technology and Creanord. Anthony also works as a senior consultant for Value Mapping Services. Anthony’s experience covers business management, product management, product development, software business, SaaS business, process management and software development outsourcing. Anthony is also a certified Cloud Digital Leader.

 

Contact us for more information on our  Value Discovery services.

Codento Community Blog: Six Pitfalls of Digitalization – and How to Avoid Them

Codento Community Blog: Six Pitfalls of Digitalization – and How to Avoid Them

By Codento consultants

 

Introduction

We at Codento have been working hard over the last few months as consultants on various digitalization projects and have faced dozens of different customer situations. At the same time, we have paused to notice how often the same avoidable pitfalls recur at these sites.

The mission of a consulting firm like Codento is to provide a two-pronged vision for our clients: to replicate the successes generally observed and, on the other hand, to avoid the pitfalls.

Drifting into avoidable, repetitive pitfalls always causes a lot of disappointment and frustration, so the entire Codento team of consultants stopped to reflect and put together our own ideas, especially on avoiding these pitfalls.

A lively and multifaceted communal exchange of ideas was born, which, based on our own experience and vision, we condensed into six root causes and themes:

  1. Starting by solving the wrong problem
  2. Remaining bound to existing applications and infrastructure
  3. Being stuck with current operating models and processes
  4. Failing to optimally exploit the potential of new cloud technologies
  5. Not sufficiently utilizing data in business
  6. Machine learning and artificial intelligence not leading to a competitive advantage

Next, we will go through this interesting dialogue with Codento consultants.

 

Pitfall 1: Starting by solving the wrong problem

How many Design Sprints and MVPs in the world have been carried out to create new solutions whose original problem setting and customer needs were based on false assumptions or were otherwise incomplete?

Or how many problems more valuable to the business have remained unresolved, left sitting in the backlog? Choosing the technology – between an off-the-shelf product and custom software, for example – is often the easiest step.

There is nothing wrong with the Design Sprint or Minimum Viable Product methodology per se: they are very well suited to uncertainty and an experimental approach, and they help avoid unnecessary production work. But there is certainly room for improvement in choosing the problems they are applied to.

Veera also recalls one situation: “We start solving the problem in an MVP-minded way without thinking very far about how the app should work in different use cases. The application can become a collection of special cases with the connecting factor between them missing. Later, major renovations may be required when the original architecture or data model does not stretch far enough.”

Markku smoothly lists the typical problems associated with the conceptualization and MVP phases: “A certain rigidity in rapid and continuous experimentation, a tendency toward perfectionism, a misunderstanding of the end customer, the wrong technology or operating model.”

“My own solution is always to reduce the problem definition to such a small sub-problem that it is faster to solve and more effective to learn from. At the same time, the positive mood grows when something visible is always achieved,” adds Anthony.

Toni sees three essential steps as a solution: “A lot of different problem candidates are needed, one of which is selected for clarification on the basis of common criteria. Work on the problem definition both broadly and deeply. Only then should you go to a Design Sprint.”

 

Pitfall 2: Trapped in existing applications and infrastructure

It’s easy in “greenfield” projects where the table is clean, but what do you do when a dusty, years-old application and IT environment stands in the way of an ambitious digital vision?

Olli-Pekka starts: “Software is not ready until it is taken out of production. Until then it keeps sinking money, which would be nice to get back, either as working time saved or simply as income. If the systems in production are not kept on track, the costs sunk into them are guaranteed to surpass the benefits sooner or later. This is due to inflation and the exponential development of technology.”

“A really old system that supports the company’s business can be virtually impossible to replace,” continues Jari T. “Its low turnover and the age of its technology mean the system is not worth replacing. It will be shut down as soon as the last parts of the business it supports have been phased out.”

“A monolithic system comes to mind that cannot be renewed part by part. Renewing the entire system would cost too much,” adds Veera.

Olli-Pekka outlines three different situations: “Depending on the user base, the pressures for modernization differ, but the need for it will not disappear at any stage. Let’s take a few examples.

Consumer products – There is no market for antiques in this industry, unless your business is based on selling NFTs of Doom’s original source code – and even then. When was the last time you admired Windows XP CDs on a store shelf?

Business products – a slightly more complicated case. For the system you use to stay relevant to your business, it needs to play nicely with the other systems your organization uses. Otherwise a replacement will be found for it, because manual steps in a process are both expensive and error-prone. There is no problem, of course, if no one ever updates their products – but I would not lull myself into that belief.

Internal use – no need to modernize? Here you simply have to train your own replacements, because no one else works on your stack anymore. Also remember to hope that none of the people you manage to entice into this technological dead end takes a peek over the fence. And set aside a little extra for maintenance contracts, as outside vendors tend to raise their prices when the user base of their sunset products shrinks.”

A few concepts immediately come to Iiro’s mind: “Path dependency and the sunk cost fallacy. Could one write a whole blog post about each of them?”

“What reasons and obstacles do the various studies identify?” ask Sami and Marika.

“I at least recall budgetary challenges, the complexity of environments, a lack of integration capability, data security, and legislation. So what would be the solution?” Anthony responds.

Olli-Pekka’s three ideas emerge quickly: “Map your system – use external pairs of eyes for this as well, because they can spot details your own eye has grown used to, and an external expert can also ask the right questions and fish out the answers. Plan your route out of the trap – you should rarely rush blindly in every direction at once; it is enough to pierce an opening where the fence is weakest, and from there you can start expanding and building new pastures at a pace that suits you. Invest in know-how – the easiest way to make a hole in a fence is with the right tools, and a skilled worker will shape the opening so that it stays easy to pass through without tearing your clothes. Do not lull yourself into believing this skill is found in-house, because if it were, the opening would already exist. In any case, help is needed.”

 

Pitfall 3: Remaining captive to current operating models

“Which is the bigger obstacle in the end: infrastructure and applications, or our own operating models and lack of capacity for change?” Tommi ponders.

“I would lean towards operating models myself,” Samuel says. “I am strongly reminded of the silo between business and IT, a high level of risk aversion, a lack of resilience, and the vagueness – or outright absence – of a guiding digital vision.”

Veera adds, “We start modeling the old processes as they are into a new application, instead of thinking about how to change the processes and benefit from better ones at the same time.”

Elmo immediately lists a few practical examples: “Word + SharePoint documentation is limiting because ‘this is how it has always been done’. Resistance to change means that modern practices and the latest tools cannot be used, which shuts out part of the potential contribution. It also limits the user base, as the organisation’s expertise across unit boundaries cannot be tapped.”

Anne continues: “Excel + Word documentation models result in information that is scattered and difficult to maintain, with information flowing by e-mail. The biggest obstacle is the culture and our way of working, not the technology itself.”

“What should I do, and where can I find motivation?” Perttu ponders, and continues with a proposed solution: “Small wins quickly – the low-hanging fruit should be picked. The longer an inefficient operation lasts, the more expensive it is to get out of it. The sunk cost fallacy could loosely be tied to this as well.”

“There are limitless areas to improve.” Markku opens up the range of options: “Business collaboration, product management, application development, DevOps, testing, integration, outsourcing, further development, management, resourcing, subcontracting, tools, processes, documentation, metrics. There is no need to be world-class in everything, but it is good to improve the area or areas that have the greatest impact for an optimal investment.”

 

Pitfall 4: The potential of new cloud technologies is not being exploited

Google Cloud, Azure, AWS or multi-cloud? Is this the most important question?

Markku answers: “I don’t think so. The metrics of financial control move cloud costs out of depreciation and directly higher up the lines of the income statement, and the target-setting of many companies does not bend to this, although in the long run it would actually have a much more positive effect on cash flow.”

A few situations come to Sanna’s mind: “You choose the technology believed to best suit your needs, because there is not enough comprehensive knowledge and experience of the existing technologies and their potential. You may therefore end up in a situation where a lot of logic and features have already been built on top of the chosen technology before discovering that another model would have suited the use case better. A real-life experience: ‘With these functions, this can be done quickly’ – and two years later: ‘Why wasn’t the IoT hub chosen?’”

Perttu emphasizes: “The use of digital platforms at work (e.g. Drive, Meet, Teams) sits closer to everyday business than the cold, technical core of cloud technology. Especially as the public debate has recently revolved around a few big companies instructing employees to return to on-site work.”

Perttu continues: “Compared to this, the services offered by digital platforms make operations more agile, enable a wider range of lifestyles, and streamline business operations. It must be remembered, of course, that physical encounters are also important to people, but one could assume that experts in any field are themselves best at defining effective ways of working. Win-win, right?”

So what’s the solution?

“I think the most important thing is that the cloud capabilities to be deployed are matched to the selected short- and long-term use cases,” concludes Markku.

 

Pitfall 5: Data is not sufficiently utilized in business

Hardly any company can avoid the question of keeping the bulk of its data well governed and in good integrity. But what are the different challenges involved?

Aleksi explains: “The practical obstacle to wider use of data in an organization is quite often the poor visibility of the available data. There may be many hidden data sets whose existence is known to only a couple of people. These may only be found by chance, by talking to the right people.

Another similar problem is that for some data sets, the content, structure, origin, or mode of creation of the data is no longer really known – and there is little documentation of it.”

Aleksi continues, “An overly absolute, too-early business case approach prevents data from being exploited in experiments and development that involve a ‘research aspect’. This is the case in many new machine learning initiatives: it is not clear in advance what can be expected, or even whether anything usable can be achieved. Such early work is therefore difficult to justify with a normal business case.

It may be better to assess the potential benefits the approach could bring if successful. If those benefits are large enough, you can start experimenting, review the situation continuously, and quickly kill the ideas that turn out to be bad. The time for the business case may come later.”

 

Pitfall 6: The use of machine learning and artificial intelligence does not lead to a competitive advantage

It seems fashionable these days for business managers to attend various machine learning courses, and a varying number of experiments are underway in organizations. Yet things have not progressed very far, have they?

Aleksi opens with his experiences: “Over time, the current ‘traditional’ approach has been honed quite well, and there is very little potential for improvement left in it. The first machine learning experiments do not produce a better result than the current approach, so it is decided to stop examining and developing them. In many cases, however, the potential of the current operating model has been almost completely exhausted over time, while on the machine learning side the ceiling for improvement would be much higher. It is as if we stay locked into the current way only because the first attempts did not immediately bring improvement.”

Anthony summarizes the challenges into three components: “Business value is unclear, data is not available and there is not enough expertise to utilize machine learning.”

Jari R. wants to bring up his talk from the spring’s business-oriented online machine learning event: “If I remember correctly, I compiled a list of as many as ten pitfalls suited to this topic. They are easy to read in the event material:

  1. The specific business problem is not properly defined.
  2. No target is defined for model reliability, or the target is unrealistic.
  3. The choice of data sources is left to data scientists and engineers, and the expertise of the business area’s experts is not utilized.
  4. The ML project is carried out by the IT department alone; experts from the business area are not involved.
  5. The data needed to build and utilize the model remains fragmented across different systems, and cloud platform data solutions are not utilized.
  6. The retraining of the model on the cloud platform is not taken into account already in the development phase.
  7. The most fashionable algorithms are chosen for the model without considering their appropriateness.
  8. The root causes of the errors the model makes are not analyzed; statistical accuracy metrics are relied on blindly.
  9. The model is built to run on the data scientist’s own machine, and its portability to the cloud platform is not considered during development.
  10. The model’s ability to analyze real business data is not systematically monitored, and the model is not retrained.”

This serves as a good example of the thoroughness of our data scientists. It is easy to agree with the list and to believe that we at Codento have a vision for avoiding the pitfalls in this area as well.
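As a small illustration of pitfall 10 – systematically monitoring whether the model can still analyze real business data, and retraining it when it cannot – the check below is a hedged sketch of ours (the function names and threshold are illustrative, not from Jari R.’s event material): it flags a model for retraining when the live input distribution drifts away from the training distribution.

```python
from statistics import mean, stdev

def drift_score(train_values, live_values):
    """Standardized shift of the live mean relative to the training data."""
    mu, sigma = mean(train_values), stdev(train_values)
    if sigma == 0:
        return 0.0
    return abs(mean(live_values) - mu) / sigma

def needs_retraining(train_values, live_values, threshold=1.0):
    """Flag the model for retraining when its input distribution drifts.
    The threshold of 1.0 standard deviations is an arbitrary example."""
    return drift_score(train_values, live_values) > threshold

train = [10, 11, 9, 10, 12, 10, 11, 9]
assert not needs_retraining(train, [10, 11, 10, 9])  # similar data: keep model
assert needs_retraining(train, [25, 27, 26, 28])     # shifted data: retrain
```

In production such a check would run on a schedule against fresh data; real systems typically use richer statistics per feature (e.g. a population stability index), but the principle – measure drift, then retrain – is the same.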

 

Summary – Avoid pitfalls in a timely manner

To keep you from falling into these pitfalls, Codento consultants have promised to offer free two-hour workshops to interested organizations, each focusing on one pitfall at a time:

  1. Digital Value Workshop: Clarified and understandable business problem to be solved in the concept phase
  2. Application Renewal Workshop: A prioritized roadmap for modernizing applications
  3. Process Workshop: Identifying potential operating model challenges for the evaluation phase
  4. Cloud Architecture Workshop: Concrete steps toward a high-quality cloud architecture and its further development
  5. Data Architecture Workshop: Preliminary current situation of data architecture and potential developments for further design
  6. Artificial Intelligence Workshop: Prioritized use case descriptions for more detailed planning from a business feasibility perspective

Ask us for more information and we will book a time for August, so the autumn starts comfortably – with the pitfalls avoided.