Data Architect with a flair for AI

ti&m stands for technology, innovation & management. We are a leader in digitisation, security and innovation projects and products in Switzerland, and we are striving to achieve the same in other financial and technology centres. We offer our discerning clients vertical integration across the entire IT value chain. At our offices in Zurich, Bern, Frankfurt and Singapore, we currently employ over 355 outstanding engineers, designers, and consultants, and further offices in Europe will follow. The basis of our growth lies in our strengths and values: courage, a wealth of ideas, agility, and entrepreneurial flair, coupled with sustainability and Swissness.

Who we are looking for

We are looking for data-savvy software architects with experience in designing, developing, and maintaining modern software solutions backed by large-scale databases, cloud solutions and data-processing pipelines. The routine challenges and daily interactions with clients and colleagues require a high level of education in computer science (or equivalent), a very good command of German and English, flexibility, natural curiosity and a willingness to drive innovation.

What you can expect

  • together with our machine learning and software engineers, you will develop complex data-processing pipelines in the cloud or on premises
  • you will design, implement and scale data automation solutions (ETL, data cleaning, data aggregation and deduplication)
  • you will take the leading role in data architecture-related consulting mandates
  • you will be maintaining and expanding the data processing technology stack
  • continuous learning and self-improvement in the diverse business areas of our customers will become part of your daily routine
  • you are guaranteed to have a good time developing challenging projects, either in-house or on site at our customers

What you should bring along

  • experience with products such as Elasticsearch, Apache Spark, Apache Kafka, and Apache Hadoop
  • several years of experience with relational and NoSQL databases
  • excellent programming skills in Java and Python
  • practical knowledge of microservices and Lambda architectures
  • solid understanding and experience with virtualisation and deployment solutions, such as Docker, Kubernetes and OpenShift
  • experience using cloud services for persisting and processing large datasets (on AWS, Google Cloud, or Microsoft Azure)
  • extensive practical knowledge of data modelling and ETL processes, especially in combination with NoSQL systems

What we offer

We offer you an interesting and dynamic working environment with attractive employment conditions. Our company culture is based on mutual respect, commitment, and transparency. Continual internal and external training is very important to us.

What are you waiting for?

Would you like to use your talent and expertise in a team-oriented environment? Send us your complete application, including cover letter, CV, diplomas, and work references. Please note that we will only consider complete applications. We look forward to meeting you!