Valiton Technology Radar

As of: May 2024

Our Technology Radar summarizes the technologies and processes that are relevant to us, assessed on the basis of our daily work. Things we would rather not use in new projects any more (HOLD) are listed here alongside items whose continued relevance we are convinced of (ADOPT).

Languages & Frameworks
(Radar chart omitted: the items below are grouped into the rings Adopt, Trial, Assess, and Hold.)



Adopt

  • 1. Cube.js

    Cube.js is a platform for building analytical web applications that leverage an organization's existing database. It works with structured data sources, enabling powerful tools for transforming and pre-aggregating data. The platform is highly flexible, allowing organizations to easily integrate it into existing applications and platforms. We recommend leveraging Cube.js, as it provides a simple yet powerful way to build scalable analytical applications that can generate insights quickly and efficiently.

  • 2. Doctrine

    Our choice for object-relational mapping (ORM) library for PHP is Doctrine. It provides a powerful and flexible solution to map object-oriented code to a relational database. Its high degree of flexibility allows for easy integration with existing applications and platforms. With Doctrine, managing data and integrating with databases is reliable and efficient.

  • 3. ESLint

    For static code analysis, identifying and fixing errors, and maintaining consistent coding standards, we recommend ESLint. It is a powerful tool that ensures high-quality code, making it easy to detect issues before they become problems. ESLint is highly customizable and can be used with any JavaScript project, making it an ideal choice for developers looking for a reliable and efficient way to maintain code quality.

  • 4. FastAPI

    FastAPI is a high-performance Python web framework for building APIs quickly and efficiently. It offers a set of tools for creating scalable APIs that are easy to learn and use. FastAPI's out-of-the-box support for asynchronous programming makes it a great choice for building real-time applications. With FastAPI, building APIs that can handle high traffic loads is simple and efficient.

  • 5. Gatsby.js

    For building fast and efficient websites that are tailored to specific needs, we recommend Gatsby.js. It provides a modern web framework that is highly modular and customizable, making it a great choice for developers looking for flexibility. Based on React and GraphQL, Gatsby.js is easy to integrate with existing applications and platforms. With Gatsby.js, building modern, scalable websites is reliable and efficient.

  • 6. Huggingface Transformers

    Huggingface Transformers is an open-source library for natural language processing (NLP). It offers a wide range of pre-trained models and tools for analyzing and processing text data, making it easy to generate insights and predictions. Huggingface Transformers is easy to integrate with existing applications and platforms, providing a simple and efficient way to build NLP applications.

  • 7. Jest

    For testing React applications, we recommend Jest. It is a JavaScript testing framework that provides a reliable and efficient way to test code, ensuring high-quality code and proper functioning of React applications. Jest is easy to learn and use, making it an ideal choice for developers who are new to testing frameworks.

  • 8. LangChain

    LangChain is an advanced framework that empowers the development of applications powered by Large Language Models (LLMs). It seamlessly supports renowned LLMs such as ChatGPT, Vertex AI, Cohere, Aleph Alpha, and many others. With LangChain, developers can leverage a variety of reusable components, including prompt templates, chat history memory, and tools for incorporating results from vector searches and external sources like Google, Wikipedia, and Wolfram into prompts. These versatile components and tools can be effortlessly chained together to orchestrate the LLM pipeline, all through intuitive and standardized interfaces.
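
The chaining idea can be illustrated without LangChain itself. Below is a stdlib-only sketch in which plain Python callables stand in for prompt templates, model wrappers, and output parsers; all names are hypothetical and this is not LangChain's actual API:

```python
from typing import Callable

def prompt_template(question: str) -> str:
    # Fill a reusable template with the user input.
    return f"Answer concisely: {question}"

def fake_llm(prompt: str) -> str:
    # Stand-in for a real LLM call (e.g. via an API client).
    return f"LLM_RESPONSE[{prompt}]"

def output_parser(raw: str) -> str:
    # Post-process the raw model output.
    return raw.removeprefix("LLM_RESPONSE[").removesuffix("]")

def chain(*steps: Callable) -> Callable:
    # Compose the steps into one pipeline, mirroring the idea of
    # chaining components through a standard interface.
    def run(value):
        for step in steps:
            value = step(value)
        return value
    return run

pipeline = chain(prompt_template, fake_llm, output_parser)
print(pipeline("What is LangChain?"))
```

The value of the standardized interface is that any step (a retriever, a different model, a stricter parser) can be swapped in without touching the rest of the chain.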

  • 9. Next.js

    Next.js is a flexible JavaScript framework for creating scalable applications. It enables server-side rendering and provides an intuitive page-based routing system, making it easy to create complex, rich web experiences. Next.js is modular, allowing organizations to quickly and easily integrate custom modules, including APIs, libraries, and plugins. Its powerful development tools support popular databases, allowing for faster iteration and deployment. With Next.js, building dynamic, feature-rich applications is simple and efficient.

  • 10. Pandas

    Pandas is a Python library used for data manipulation and analysis. It provides data structures for efficiently storing and manipulating large data sets, as well as tools for data cleaning, merging, and filtering. Pandas is widely used in data science and machine learning projects due to its flexibility and ease of use. We recommend using Pandas, as it simplifies and accelerates data analysis tasks.
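
As a small illustration of a typical Pandas task (the column names and values are invented for this sketch): fill missing values, then aggregate per group:

```python
import pandas as pd

# Toy order data with a missing amount.
orders = pd.DataFrame({
    "customer": ["a", "a", "b", "b", "b"],
    "amount": [10.0, None, 5.0, 7.5, 2.5],
})

# Clean: replace missing amounts, then aggregate per customer.
orders["amount"] = orders["amount"].fillna(0.0)
totals = orders.groupby("customer")["amount"].sum()
print(totals.to_dict())  # -> {'a': 10.0, 'b': 15.0}
```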

  • 11. PyTorch

    PyTorch is an open-source machine learning library for efficient computation and development of deep learning models. It is used for a wide range of tasks, from training models to producing complex neural networks. PyTorch's flexible architecture makes it easy to integrate with existing applications and platforms. It also has excellent support for GPUs, enabling organizations to take advantage of hardware acceleration for model training. We recommend leveraging PyTorch, as it provides a powerful, reliable, and efficient way to develop and train deep learning models.

  • 12. React

    React is a popular JavaScript library for building user interfaces. It provides a component-based architecture, making it easy to create reusable UI elements that can be used across projects. React also supports server-side rendering and has excellent performance and scalability, making it ideal for large-scale projects. We suggest using React, as it simplifies UI development and provides a flexible and scalable foundation for web applications.

  • 13. Sass

    Sass is a CSS preprocessor that allows developers to write more efficient and maintainable CSS code. It provides a range of features, such as variables, mixins, and nesting, that make it easier to write and organize CSS. Sass also includes powerful features for generating CSS, such as functions and control structures, enabling developers to create more flexible and reusable styles. We recommend using Sass, as it simplifies CSS development and improves code organization and maintainability.

  • 14. Serverless Framework

    We use the Serverless Framework for pretty much everything we deploy on AWS Lambda, for example. At the same time, we try to keep the code itself as provider-independent as possible.

  • 15. spaCy

    spaCy is a free, open-source library for advanced natural language processing in Python. It is fast and efficient, with high accuracy. We use it extensively in our data projects, for tasks such as text classification and entity recognition.

  • 16. Symfony

    Symfony is a popular PHP web application framework used for building complex, high-performance web applications. It's been around for over a decade and has a strong community of developers supporting it. We've used it in several projects and find it reliable and easy to work with.

  • 17. Tailwind CSS

    Tailwind CSS follows the utility-first approach of the Atomic Design methodology. Unlike heavyweight frameworks such as Bootstrap or Bulma, Tailwind CSS provides utility classes backed by a built-in design system that can be flexibly combined, making it easy to create custom designs. Tailwind works with any JavaScript framework or even plain HTML, often reduces the amount of handwritten CSS to a fraction, and produces very small stylesheets through CSS purging (often < 10 KB).

  • 18. Vue.js

    Vue.js is a progressive JavaScript framework for building user interfaces. It's lightweight and easy to integrate into existing projects. We've used it in several projects and found it to be a great choice for building dynamic, responsive user interfaces.


Trial

  • 48. Astro

    Astro is a JavaScript framework for creating web applications. Astro creates small "islands" to make your page interactive without shipping JavaScript everywhere; that is, it eliminates unnecessary JavaScript. For the islands you can use your favorite library, such as React or Vue; for the rest, Astro uses plain HTML. This means that if you know HTML, you can use Astro.

  • 49. Headless CMS

    Headless CMS systems provide an alternative to traditional CMS platforms, allowing organizations to manage, store, and deliver digital content more flexibly and efficiently. They decouple the backend content repository from the frontend user experience, enabling custom publishing and integrations with software languages and content partners. We would like to evaluate multiple headless CMS tools such as Strapi to identify the right platform for our organization.

  • 50. Polars

    Polars is a DataFrame library for Python and also Rust. It is one of the fastest Pandas alternatives, parallelizing processing across all available CPU cores, and it shines with impressive benchmarks in performance and memory usage. Its feature set is mature but still lags behind Pandas in places, e.g. for plotting graphs. It is therefore a very good alternative for automating data pipelines where processing speed is of the essence.

  • 51. Streamlit

    Streamlit is a Python framework for building simple web apps and UIs with only a few lines of code. We use it for rapid prototyping and for creating non-production applications for PoCs and demonstrators.


Assess

  • 63. OpenObserve

    OpenObserve is a cloud-native observability platform built specifically for logs, metrics, traces, and analytics, and designed to work at petabyte scale.


Hold

  • 67. gensim

    Gensim is a framework for topic modeling and natural language processing (NLP). It is a Python library that provides a reliable and efficient way to extract insights and patterns from large text datasets. However, we have put gensim on hold because active development has stopped and the project has moved into stable maintenance mode. In addition, most of the provided models and algorithms are no longer state of the art.



Adopt

  • 19. Apache Airflow

    Apache Airflow is an open-source platform used to programmatically author, schedule, and monitor workflows. We use it heavily in most data projects, such as ETL pipelines, as it offers great flexibility and extensibility.
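
The core abstraction behind an Airflow workflow is a DAG of tasks. The sketch below does not use Airflow's API; it only illustrates, with Python's stdlib graphlib, how a dependency graph of hypothetical ETL tasks resolves into an execution order:

```python
from graphlib import TopologicalSorter

# Each key runs after the tasks in its value set, mirroring the DAG
# idea behind an ETL pipeline (task names are hypothetical).
dag = {
    "extract": set(),
    "transform": {"extract"},
    "load": {"transform"},
    "notify": {"load"},
}

execution_order = list(TopologicalSorter(dag).static_order())
print(execution_order)  # -> ['extract', 'transform', 'load', 'notify']
```

Airflow adds scheduling, retries, and monitoring on top of exactly this kind of dependency resolution.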

  • 20. Apache Kafka

    Apache Kafka is an open-source distributed event streaming platform used for building fault-tolerant architectures for real-time streaming data. It can be used to collect, store, distribute, and analyze massive amounts of data from numerous sources. Kafka is highly configurable, offering robust APIs for both producers and consumers, as well as security and scalability support. Leveraging Kafka can help us to process and manage streaming data more effectively and efficiently.
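
Kafka's core abstraction, an append-only log read by independent consumer groups at their own offsets, can be sketched in a few lines of plain Python (a concept illustration, not the Kafka client API):

```python
class Topic:
    def __init__(self):
        self.log = []       # append-only record log
        self.offsets = {}   # consumer group -> next offset to read

    def produce(self, record):
        self.log.append(record)

    def consume(self, group):
        # Each group keeps its own position, so the same record can be
        # delivered to many groups without being removed from the log.
        pos = self.offsets.get(group, 0)
        records = self.log[pos:]
        self.offsets[group] = len(self.log)
        return records

topic = Topic()
topic.produce({"event": "page_view"})
topic.produce({"event": "click"})
print(topic.consume("analytics"))  # both records
print(topic.consume("analytics"))  # [] - this group is caught up
print(topic.consume("billing"))    # both records again, independent offset
```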

  • 21. AWS Organizations

    We use AWS Organizations to maintain and govern our own AWS infrastructure as well as that of our customers. It enables us to simplify plenty of everyday tasks and supports us in applying our security standards.

  • 22. AWS SageMaker

    AWS SageMaker is a cloud-based machine learning platform that provides a fully managed environment for training, deploying, and managing models. It enables our data scientists to quickly build, train, and deploy high-quality machine learning models without needing to manage the underlying infrastructure. SageMaker includes extensive APIs, algorithms, architectures, and tools to support every step of the machine learning process.

  • 23. dbt

    dbt is a command-line tool that enables data analysts and engineers to transform data in their warehouse more effectively. We use it mostly in combination with Snowflake to build high-quality data pipelines and models.

  • 24. Snowflake

    Snowflake is a cloud-based data storage and analytics service that we use in many of our data projects. It's often used in combination with dbt, and we have found it to be a reliable and effective technology.

  • 25. Tideways

    Tideways is a PHP performance monitoring and profiling solution that we use to identify performance bottlenecks in our PHP applications.

  • 26. Weaviate

    Weaviate is an open-source vector database. It allows you to store data objects and vector embeddings from your favorite ML models, and scales seamlessly into billions of data objects. A vector database forms the backbone of vector search and generative search.
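
At its core, a vector database answers nearest-neighbour queries over embeddings. A brute-force sketch of that idea follows; real engines such as Weaviate use approximate indexes (e.g. HNSW) to scale, and the toy vectors below are made up:

```python
import math

def cosine_similarity(a, b):
    # Cosine similarity: dot product divided by the vector norms.
    dot = sum(x * y for x, y in zip(a, b))
    norm_a = math.sqrt(sum(x * x for x in a))
    norm_b = math.sqrt(sum(y * y for y in b))
    return dot / (norm_a * norm_b)

# Hypothetical document embeddings.
documents = {
    "doc_cat": [1.0, 0.1, 0.0],
    "doc_dog": [0.9, 0.2, 0.1],
    "doc_car": [0.0, 0.1, 1.0],
}

query = [1.0, 0.0, 0.0]
best = max(documents, key=lambda d: cosine_similarity(query, documents[d]))
print(best)  # -> doc_cat
```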


Trial

  • 52. API Gateways

    API Gateways provide centralized control, visibility, and enhanced security for an organization's APIs. They enable a business to control who has access to its APIs and which operations a consumer can perform, and to enforce mechanisms like rate limiting and throttling to prevent resources from being overloaded. Other features include traffic control, authorization, authentication, request validation, and analytics to help optimize performance. We would like to evaluate various API management and gateway solutions in order to identify the platform best suited to meet our needs.
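
One of the gateway features mentioned above, rate limiting, is commonly implemented as a token bucket. A minimal sketch, with hypothetical capacity and refill parameters:

```python
class TokenBucket:
    def __init__(self, capacity, rate):
        self.capacity = capacity      # max burst size
        self.rate = rate              # tokens refilled per second
        self.tokens = float(capacity)
        self.last = 0.0

    def allow(self, now):
        # Refill proportionally to elapsed time, then spend one token.
        self.tokens = min(self.capacity,
                          self.tokens + (now - self.last) * self.rate)
        self.last = now
        if self.tokens >= 1.0:
            self.tokens -= 1.0
            return True
        return False

bucket = TokenBucket(capacity=2, rate=1.0)  # 2-request burst, 1 req/s refill
print([bucket.allow(t) for t in (0.0, 0.1, 0.2, 1.2)])
# -> [True, True, False, True]
```

The third request is rejected because the burst budget is spent; by t=1.2 the bucket has refilled enough to admit traffic again.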

  • 53. Trino

    Trino is a fast, distributed SQL query engine for big data analytics that helps you explore your data universe.


Assess

  • 64. KServe

    KServe (previously KFServing) solves production model serving on Kubernetes. It delivers high-abstraction, performant interfaces for frameworks such as TensorFlow, XGBoost, scikit-learn, PyTorch, and ONNX.

  • 65. Opensearch

    OpenSearch is a community-driven, open-source search and analytics suite used by developers to ingest, search, visualize, and analyze data. OpenSearch consists of a data store and search engine (OpenSearch), a visualization and user interface (OpenSearch Dashboards), and a server-side data collector (Data Prepper).

  • 66. qdrant

    Qdrant is an open-source vector database. It promises to be very performant and to scale very well to millions of vectors. It also provides pre-filtering, clients for many programming languages, and an intuitive web UI for exploring the content.


Hold

  • 68. Elasticsearch

    Elasticsearch is a distributed search engine that we use mainly as a RESTful wrapper around Apache Lucene. We rely on it in most of our logging infrastructure and find it to be a reliable and effective technology. We have put it on hold due to the recent license changes and have started evaluating alternative solutions.



Adopt

  • 27. Ansible

    Ansible is an open source IT automation engine that automates provisioning, configuration management, application deployment, orchestration, and many other IT processes.

  • 28. Apache Superset

    Apache Superset is an open-source, cloud-native application for data exploration and data visualization. It is capable of handling data at petabyte scale, and in our experience it has proven dependable and efficient.

  • 29. DVC

    Data Version Control is a system designed specifically for machine learning projects, enabling developers to securely track, protect, share and reproduce data sets, functions, pipelines and more throughout the project lifecycle. DVC reduces onboarding time and costs, making it worth a trial for its ability to streamline data version control.

  • 30. GitLab

    GitLab is one of our primary tools for source code management, CI/CD, and package management. We have been using the Community Edition for years and continue to be impressed by its growing feature set.

  • 31. Grafana

    Grafana is a composable platform for monitoring and observability which we use in many of our projects to monitor our infrastructure. It has consistently demonstrated its reliability and effectiveness in our experience, making it a go-to tool for us.

  • 32. Helm

    Helm is a package manager for Kubernetes and helps us manage our applications running in a Kubernetes cluster. Using Helm charts helps us to standardize workflows and reduces complexity.

  • 33. KubeClarity

    KubeClarity is a tool for detection and management of Software Bill Of Materials (SBOM) and vulnerabilities of container images and filesystems. It scans both runtime K8s clusters and CI/CD pipelines for enhanced software supply chain security.

  • 34. Kubernetes

    Kubernetes is an open-source container orchestration platform that automates the deployment, scaling, and management of containerized applications. We have found it to be a reliable and effective technology for managing our applications.

  • 35. Prometheus

    Prometheus is a software application used for event monitoring and alerting. It records real-time metrics in a time-series database (allowing for high dimensionality), collects them via an HTTP pull model, and offers flexible queries and real-time alerting.
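
In the pull model, a scrape target simply serves its metrics in Prometheus's text exposition format over HTTP. A minimal sketch of rendering one counter in that format (the metric name and labels are invented for illustration):

```python
def render_metric(name, help_text, value, labels=None):
    # Build one metric in the Prometheus text exposition format:
    # HELP and TYPE comments, then the sample line with sorted labels.
    label_str = ""
    if labels:
        pairs = ",".join(f'{k}="{v}"' for k, v in sorted(labels.items()))
        label_str = "{" + pairs + "}"
    return (
        f"# HELP {name} {help_text}\n"
        f"# TYPE {name} counter\n"
        f"{name}{label_str} {value}\n"
    )

body = render_metric(
    "http_requests_total",
    "Total HTTP requests.",
    1027,
    labels={"method": "get", "status": "200"},
)
print(body)
```

Prometheus scrapes such a /metrics endpoint on a fixed interval and stores each sample with its label set as a separate time series.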

  • 36. RabbitMQ

    RabbitMQ is a message broker that enables applications to send and receive messages using a variety of messaging protocols. It supports multiple messaging patterns, including point-to-point, publish/subscribe, and request/reply. RabbitMQ is highly available and can scale to handle large amounts of messaging traffic, making it a reliable and efficient choice for message-based architectures.

  • 37. Redis

    Redis is an in-memory key-value store that can be used as a database, cache, and message broker. It supports a wide range of data structures, including strings, hashes, lists, sets, and sorted sets. Redis is highly available and provides automatic failover, making it a reliable choice for applications that require high availability. Its rich set of features and APIs make it a versatile tool for building complex applications.

  • 38. Renovate

    We use Renovate to update our project dependencies via pull requests. It automatically applies all applicable patches and security updates while filtering out risky changes. This helps us keep track of outdated dependencies and minimises the work needed for updates.

  • 39. Snowplow

    Snowplow Analytics is an open-source enterprise event-level analytics platform that enables data collection from multiple platforms for advanced data analytics. At the moment we see Snowplow as the most suitable building block for data collection, in order to be able to detach from large paid analytics products.

  • 40. Solr

    Solr is a widely used, open source search platform that allows for powerful full-text search, hit highlighting, faceted search, dynamic clustering, database integration, and more. We use Solr for search and retrieval in our projects, as it provides efficient and accurate results.

  • 41. SonarQube

    SonarQube is an open-source platform for continuous inspection of code quality. It provides metrics, code coverage, and code duplication analysis, as well as automated code reviews, which help us maintain code quality and reduce technical debt.

  • 42. Vite

    Vite is a next-generation build tool for scaffolding and building modern JavaScript projects, with rapidly growing popularity. It is framework-agnostic. Vite takes advantage of native ES module support and uses esbuild for pre-bundling dependencies during development (and Rollup for production builds). As a result, it can drastically reduce build times compared to webpack, Rollup, and Parcel. It also offers out-of-the-box support for TypeScript. Vite has replaced webpack in the vue-cli.


Trial

  • 54. ClickHouse

    ClickHouse is an open-source columnar database management system (DBMS) designed for high-performance analytics and data processing. It is known for its exceptional speed and scalability, making it ideal for handling large volumes of data and performing real-time analytical queries.

  • 55. DevSpace

    DevSpace is a client-only, open-source developer tool for Kubernetes. It lets you build, test, and debug applications directly inside Kubernetes and develop with hot reloading, updating your running containers without rebuilding images or restarting them. It also helps unify deployment workflows within your team and across dev, staging, and production, and automates repetitive tasks for image building and deployment.

  • 56. GreatExpectations

    GreatExpectations is an open-source data quality framework to enable best practices and guardrails via automated data validation and profiling. We want to assess its suitability for our organization for efficient, cost-effective data quality.

  • 57. Karpenter

    Karpenter is an open-source node lifecycle management project built for Kubernetes. Adding Karpenter to a Kubernetes cluster can dramatically improve the efficiency and cost of running workloads on that cluster. The tool integrates tightly with AWS and can react to events such as EC2 Spot termination notices, automatically spinning up new instances when a Spot request is reclaimed by AWS. This increases the stability of Spot workloads and ultimately saves us money, because we can use Spot instances for more use cases.

  • 58. OpenTofu

    Terraform changed its license terms some time ago. Even though this does not affect us yet, OpenTofu, a fork supported by the Linux Foundation, has emerged. Since we use Terraform intensively, we would like to understand whether the fork can be the better option in the long term.



Hold

    • 69. Elastic Stack

      With the Elastic Stack, we can reliably and securely capture data from any source and in any format, and then search, analyze, and visualize it in real time. We have put it on hold for the same reason (license changes) that we put Elasticsearch on hold.

    • 70. Harbor

      Harbor is a cloud-native registry project that enhances the open-source Docker Distribution by adding security, identity, and management functionalities. It enables us to store, sign, and scan content for our projects that are not running in the cloud. We have put it on hold because we now use container and package registries in a project context, and for security scanning with Clair we switched to KubeClarity.

    • 71. Rancher

      Rancher is a cloud-native platform that is still maturing. As the platform matures, businesses can expect more reliable, secure, and cost-effective ways to deploy and scale cloud-native applications; however, until more features, tools, and resources are added, we are evaluating other options for bootstrapping Kubernetes clusters.

    • 72. webpack

      We have placed Webpack, a JavaScript module bundler, on hold. Webpack can be complex and resource-intensive, needing a good understanding of the project's underlying technology and structure. Poor configuration or plugin usage can lead to bloated bundles and browser incompatibility issues. Currently, we are considering alternatives like Vite to understand if they are better suited for our projects.



Adopt

    • 43. Clean Code

      Clean Code is a set of principles and practices for writing code that is easy to understand, maintain, and extend. It includes naming conventions, code formatting, and the use of comments and documentation. We follow the principles of Clean Code to ensure that our code is readable, maintainable, and of high quality.
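
A miniature example of what these principles mean in practice: the same check before and after applying an intention-revealing name, a named constant, and a guard clause (the domain rule itself is invented):

```python
def calc(x):
    # Before: cryptic name, nested logic, magic number.
    if x is not None:
        if x > 18:
            return True
        else:
            return False
    return False

def is_adult(age):
    # After: descriptive name, named constant, guard clause.
    ADULT_AGE = 18
    if age is None:
        return False
    return age > ADULT_AGE

# Both behave identically; only readability differs.
print(is_adult(30), is_adult(10), is_adult(None))  # -> True False False
```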

    • 44. Functional programming

      Functional Programming is a programming paradigm that emphasizes the use of pure functions, immutable data, and declarative style. We use functional programming to write code that is modular, testable, and easier to reason about. It also helps us to write code that is more resilient to change and easier to maintain.
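
A small sketch of the style: a pure function that returns new data instead of mutating its input (the tax-calculation example is invented):

```python
from functools import reduce

def add_tax(prices, rate):
    # Pure function: returns a new list; `prices` is never mutated.
    return [round(p * (1 + rate), 2) for p in prices]

net = [10.00, 20.00]
gross = add_tax(net, 0.19)
total = reduce(lambda acc, p: acc + p, gross, 0.0)

print(net)    # unchanged: [10.0, 20.0]
print(gross)  # [11.9, 23.8]
print(total)  # ~35.7
```

Because `add_tax` has no side effects, it is trivial to test in isolation and safe to reuse anywhere.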

    • 45. OpenAPI

      OpenAPI (formerly known as Swagger) is an open standard for describing APIs. It provides a way to describe the structure and functionality of APIs, which makes it easier to develop, test, and maintain them. We use OpenAPI to document our APIs, which makes it easier for developers to understand how to use them and to build applications that consume them.
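
A minimal OpenAPI 3.0 document, sketched here as a Python dict for illustration (the Orders API and its path are hypothetical; in practice the spec usually lives in a YAML or JSON file next to the service):

```python
import json

spec = {
    "openapi": "3.0.3",
    "info": {"title": "Orders API", "version": "1.0.0"},
    "paths": {
        "/orders/{orderId}": {
            "get": {
                "summary": "Fetch a single order",
                "parameters": [{
                    "name": "orderId",
                    "in": "path",
                    "required": True,
                    "schema": {"type": "string"},
                }],
                "responses": {
                    "200": {"description": "The requested order"},
                    "404": {"description": "Order not found"},
                },
            }
        }
    },
}

# The three top-level keys every OpenAPI 3 document must carry.
print(sorted(spec))  # -> ['info', 'openapi', 'paths']
print(json.dumps(spec["info"]))
```

From such a document, tooling can generate interactive docs, client SDKs, and request validators.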

    • 46. Performance Testing Frontend

      Performance testing of frontend applications is a technique that helps us to measure the speed and stability of our applications. It involves simulating real-world scenarios and measuring the performance of our applications under different loads. We use performance testing to ensure that our applications are fast, reliable, and can handle the traffic they receive.

    • 47. Pipelines as code

      The pipeline-as-code technique advocates that the deployment pipeline configuration for building, testing, and deploying our applications or infrastructure should be treated as code. It should be placed under source control and ideally be modularized into reusable components.
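
The idea can be sketched as follows: the pipeline definition is plain data that can live under version control, executed here by a tiny interpreter (the stage names and toy runner are hypothetical, not a real CI system's syntax):

```python
# Pipeline definition as data: versionable, reviewable, reusable.
pipeline = {
    "stages": ["build", "test", "deploy"],
    "jobs": {
        "build": lambda: "artifact-1.0",
        "test": lambda: "tests passed",
        "deploy": lambda: "deployed to staging",
    },
}

def run_pipeline(config):
    # Run each stage in the declared order and collect its result.
    return [config["jobs"][stage]() for stage in config["stages"]]

print(run_pipeline(pipeline))
# -> ['artifact-1.0', 'tests passed', 'deployed to staging']
```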


Trial

    • 59. Code Assistants

      There are now quite a few options for AI-based support in the development process, from the commercial variant GitHub Copilot to the open-source option GPT Code Clippy. We would like to explore these and understand if and how they can improve our development processes.

    • 60. Infrastructure as Code Test Automation

      We rely on IaC with Terraform in our cloud projects. The larger the infrastructures become, the more important automated testing of the landscapes created with code becomes. Terratest currently seems to be emerging as the tool of choice for this.

    • 61. LLM Observability

      LLM observability provides tools, techniques, and methodologies to help teams manage and understand LLM application and language model performance, detect drifts or biases, and resolve issues before they have significant impact on the business or end-user experience.

    • 62. Policy as Code

      Policy as Code (PaC) is a DevOps practice that enables organizations to express their security and compliance policies as easy-to-maintain code. This approach makes policies easier to manage and update, leading to more secure and reliable operations. We are therefore assessing the suitability of PaC to benefit from its ability to streamline the enforcement of policies and reduce the overhead of manual governance.
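
A sketch of the idea: policies expressed as small, versionable check functions evaluated automatically against resource configurations (the resource fields and policy names below are invented; dedicated engines such as OPA express the same thing in a policy language):

```python
# Policies as code: each policy is a name plus a check function.
POLICIES = [
    ("s3-encryption-required", lambda r: r.get("encrypted", False)),
    ("no-public-access", lambda r: not r.get("public", False)),
]

def evaluate(resource):
    # Return the names of all policies this resource violates.
    return [name for name, check in POLICIES if not check(resource)]

good_bucket = {"name": "logs", "encrypted": True, "public": False}
bad_bucket = {"name": "assets", "encrypted": False, "public": True}

print(evaluate(good_bucket))  # -> []
print(evaluate(bad_bucket))   # -> ['s3-encryption-required', 'no-public-access']
```

Because the rules are code, they can be reviewed, versioned, and enforced in CI like any other artifact.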