
adesso BLOG

Methodology

Metadata-driven data pipelines are a game changer for data processing in companies. Instead of manually revising each step whenever a data source changes, these pipelines use metadata to update processes dynamically. But just like the pipelines themselves, the metadata can become a bottleneck when maintaining and evolving a pipeline framework. In this blog post, I use practical examples to show how the Jsonnet template language makes metadata easier to maintain.
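The core idea behind templating pipeline metadata is to define shared defaults once and let each data source override only what differs, which is what Jsonnet's object inheritance provides. A rough sketch of that idea in plain Python (all field names are illustrative, not from the post):

```python
# Illustrative sketch of metadata templating: shared defaults are defined
# once, and each data source overrides only the fields that differ.
# All names and fields here are hypothetical examples.

PIPELINE_DEFAULTS = {
    "schedule": "daily",
    "file_format": "parquet",
    "on_error": "retry",
}

def source_config(**overrides):
    """Merge per-source overrides over the shared defaults."""
    return {**PIPELINE_DEFAULTS, **overrides}

SOURCES = {
    "crm_contacts": source_config(table="contacts"),
    "erp_orders": source_config(table="orders", schedule="hourly"),
}

# Adding a new source is one line of metadata; the pipeline code that
# consumes SOURCES does not change.
```

In Jsonnet itself, the same pattern is expressed with object inheritance (`defaults + { schedule: "hourly" }`), so the override logic lives in the template language rather than in custom merge code.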

Read more
AI

Workflow orchestration and workflow engines are crucial components in modern data processing and software development, especially in the field of artificial intelligence (AI). These technologies make it possible to efficiently manage and coordinate various tasks and processes within complex data pipelines. In this blog post, we present Prefect, an intuitive tool for orchestrating workflows in AI development.
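At its core, a workflow engine runs tasks in dependency order and tracks their results. The following is a deliberately simplified stand-in for that idea using only the standard library; it is not Prefect's actual API (Prefect uses `@task` and `@flow` decorators), and the task names are invented:

```python
# Minimal sketch of what a workflow engine does: execute tasks in
# dependency order. This is a simplified illustration, not Prefect code.
from graphlib import TopologicalSorter

def run_workflow(tasks, dependencies):
    """tasks: name -> callable; dependencies: name -> set of prerequisites."""
    results = {}
    for name in TopologicalSorter(dependencies).static_order():
        results[name] = tasks[name]()
    return results

# Hypothetical three-step pipeline: extract -> transform -> load.
tasks = {
    "extract": lambda: [1, 2, 3],
    "transform": lambda: "transformed",
    "load": lambda: "loaded",
}
order = run_workflow(tasks, {"transform": {"extract"}, "load": {"transform"}})
```

A real orchestrator like Prefect adds the parts this sketch omits: retries, scheduling, parallel execution, and observability of each task run.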

Read more
Methodology

In the ever-evolving world of data analytics and data management, Snowflake plays a prominent role in shaping the industry. This blog post looks at how Snowflake has developed and why it is considered a ground-breaking solution for businesses.

Read more
Architecture

Before the introduction of computers, companies used account books, inventory lists and intuition to record key figures. The data warehouse for static reports emerged at the end of the 1980s, but digitalisation brought new challenges: big data overwhelmed traditional data warehouses, so companies introduced data lakes, which in turn changed the architecture and concepts of analytics systems. The first part of this blog post covers their development, the reasons for their emergence and the problems they solved.

Read more
Architecture

In part one of this blog post, we looked at the basics of data warehouses and data lakes to understand their benefits and limitations. This part is about the evolution towards a hybrid solution: the data lakehouse. From the two-tier architecture to the emergence of the lakehouse concept, it traces the evolution of data architectures. Learn how the lakehouse combines the strengths of data lakes and data warehouses to address today's data-driven challenges.

Read more
Methodology

In an increasingly digitalised world, the systematic collection, interpretation and use of data is becoming a decisive success factor. A company that fails to do this loses a key tool it needs to stay competitive and innovative in the age of digitalisation. In my blog post, I explain why data governance is important for companies and how adesso can support them in this area.

Read more
Methodology

04.09.2023 By Mike Deecke

The ingenious data lake

A data lake, also known as a data platform, is an ingenious and highly effective solution for the complex challenges that come with storing, managing and analysing vast quantities of data. It provides an advanced infrastructure that enables organisations to store data in its native form and remain flexible about how they process it later. In my blog post, I go through the advantages it offers.

Read more
Methodology

Apache Hudi, Iceberg and Delta Lake have proven to be powerful tools that are revolutionising the way modern data platforms handle data management and analysis. In my blog post, I discuss the core features of these open table formats and highlight their respective strengths and applications.

Read more
Industries

09.06.2023 By Juan Carlos Peñafiel Suárez

Which laboratory challenges can data standards solve?

The era of digitalisation has seen the amount of available data increase exponentially. Good communication between systems remains the key to managing it, and data standards play a major role in this. This blog post explains which challenges data standards can solve.

Read more