Examples of Master projects 2022

Summary:

In an uncertain global context, the Swiss retail market, and the grocery retailing sector in particular, is confronted with new challenges. Changing consumer habits, new market entrants and rising production costs all threaten traditional retailers such as Migros. The present research project took place in this context. The study focuses on the optimization of in-store logistics processes, from delivery receipt to shelf replenishment. An efficient logistics chain is essential to respond to an increasingly fast-paced market, and these final stages of the supply chain are all the more critical because they carry out the final distribution to consumers. The aim of the project is to propose a strategy for designing storage spaces in the group’s stores so as to boost yield and efficiency. A mixed-integer optimization program for the layout of the storage departments is then detailed. Its purpose is to reduce the distances travelled by personnel between the backroom areas and the different departments on the sales floor, taking into account the replenishment frequency of each sector. The model can then be applied to real sales areas; an example of such an application is presented, for which a reduction of more than 10% in employee travel was computed. The enhancements resulting from this research increase efficiency and performance, help avoid shelf out-of-stocks, and reduce the time devoted to logistical duties, for instance freeing staff to provide better customer service.
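As a minimal sketch (the thesis’s actual mixed-integer program is not reproduced in this summary), the core of such a layout model can be viewed as an assignment problem: each department’s goods are assigned to a backroom slot so that total travel, weighted by replenishment frequency, is minimised. The department names, distances and frequencies below are invented for illustration.

```python
from itertools import permutations

# Hypothetical data: distance (metres) from each backroom slot to each
# sales-floor department, and weekly replenishment frequency per department.
departments = ["dairy", "produce", "beverages"]
frequency = {"dairy": 14, "produce": 21, "beverages": 7}   # trips per week
distance = {                      # distance[slot][department]
    0: {"dairy": 30, "produce": 50, "beverages": 20},
    1: {"dairy": 40, "produce": 25, "beverages": 35},
    2: {"dairy": 55, "produce": 45, "beverages": 15},
}

def total_travel(assignment):
    """Weekly distance walked if slot i holds assignment[i]'s goods."""
    return sum(frequency[dep] * distance[slot][dep]
               for slot, dep in enumerate(assignment))

# Exhaustive search is fine for a toy instance; the thesis uses a
# mixed-integer program to scale to real store layouts.
best = min(permutations(departments), key=total_travel)
print(best, total_travel(best))
```

On this toy instance the frequency-weighted search places the most frequently replenished departments in the slots closest to their aisles, which is exactly the effect the optimization model seeks at full scale.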

Summary:
Payments are everywhere. Nowadays, using our credit card has become a simple, seamless everyday experience, but in the background the process is a complex one, involving dozens of distinct steps and multiple players operating along the value chain. Behind this opaque ecosystem, we find a high-growth, highly competitive, fragmented industry, in which Covid-19 has accelerated many of the profound ongoing behaviour shifts towards online, contactless, new payment forms and infrastructure. As a result, and more than ever, incumbents are under pressure to innovate fast against disruptive new entrants leveraging best-in-class service offerings and strong financial backing from investors.
In this context, we argue that, in highly fragmented, competitive, and tech-focused industries like the payment industry, M&A can be leveraged as an integrated strategy providing long-term sustained competitive advantage.
This paper contributes by exploring the payment industry’s competitive landscape and key trends going forward. Moreover, we develop a conceptual framework to understand the reasoning of managers behind the pursuit of inorganic strategies. In that direction, we identify key strategic considerations, outline consolidation trends in the payment industry, and draw parallels with a real M&A transaction.

Results:
Our research shows that the argument holds true under certain market and firm-specific conditions. In particular, we outline the benefits of M&A as an efficient consolidation tool in capex-intensive industries with network effects, enabling economies of scale, imitation protection and fast innovation. Moreover, speed and foresight in transaction execution are seen as critical for a successful long-term integrated strategy of inorganic growth.

Summary:
In order to address some of its greatest challenges faced today, society needs radical innovation.
Venture Capital has been a primary enabler of innovation over the past few decades but has benefitted some areas of technological progress more than others. Deep Tech traditionally did not fit the mould of VC financing due to its long development cycles and early capital intensity, though as these societal challenges become more pertinent, some VC firms and ecosystems are adapting themselves to support this area. Regarded as the most innovative country in the world, with a history of competing beyond its size, Switzerland has the foundations to become a leading global Deep Tech ecosystem. However, despite housing a significant proportion of the world’s wealth, there is a lack of capital available in the country to fund the development and commercialisation of radical innovation. This paper investigates the hesitancy of Venture Capital in Switzerland through a series of interviews with capital providers and investment professionals, in order to understand what drives the investment decisions of Limited Partners and to assess how Swiss Deep Tech VC firms can address them to increase capital flows.


Results:
The interviews highlighted that whilst sophisticated investors recognise that the Private Equity and VC asset classes are necessary to maintain strong portfolio returns amid increased inflation, most Swiss investors are still too inexperienced to optimise their allocation to them effectively. This historic and cultural hesitancy, rooted in the prudent nature of the Swiss investor, has led to an investment mindset of “capital conservation” rather than capital growth. The consequential lack of interest means that the necessary relationships have not been developed to overcome the perception of operational and reputational risk linked to these asset classes. Swiss Deep Tech VCs must facilitate a change in investment mentality by speaking to the long-term orientation of Swiss investors. They must adopt a coordinated strategic approach that recognises the particularities of the Swiss investor and incorporates education, standardisation and benchmarking in order to showcase success. Most importantly, this strategy must build strong, trusted, long-term relationships whereby VC in general, and Deep Tech VC in particular, becomes a key element in strategic investment portfolios.

Summary:

The aim of this study is to bring a new way of looking at real estate decision-making, which currently relies very little on automated decisions, but rather on human instinct. Therefore, the study is intended to be comprehensive and innovative in terms of the decisions made on Swiss properties in the bank’s real estate portfolio.
The study aims to show the potential of automated decision-making processes in real estate departments while implementing the newly created algorithms. A dataset on the properties’ energy consumption is used in addition to the bank’s general property dataset, as well as data from an online advertisement platform where buyers and sellers can find their counterparts.
The study is divided into three parts. The first part aims to determine as precisely as possible the return on investment of the properties and their CO2 emissions. This involves scraping the energy data and refining the process used to calculate the price of a property. The second part uses these two factors to optimize the property portfolio; the properties defined as unsatisfactory according to the criteria then have to be classified as either for renovation or for sale. The latter rarely happens in practice, but it is still necessary to sell these unsatisfactory properties in an optimal way. Therefore, in the last part, predictions of the future values of these properties are made, and a second optimization is carried out to maximize the financial gain from these sales. Finally, the energy and financial gains are compared to the initial state, and the general limitations of the model as a whole are underlined.
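A minimal sketch of such a two-criterion screening, with invented figures, could flag properties that are dominated on both yield and emissions (some other property has a higher yield and lower CO2), leaving the flagged ones for the renovate-or-sell decision. This is only one plausible screening rule, not the study’s actual optimization.

```python
# Hypothetical portfolio: (name, net yield in %, CO2 in kg per m2 per year).
portfolio = [
    ("A", 3.8, 8.0),
    ("B", 2.1, 28.5),   # low yield, high emissions: dominated by A
    ("C", 4.5, 9.0),
    ("D", 3.0, 18.0),   # dominated by A as well
    ("E", 1.9, 30.0),   # dominated by A as well
]

def dominated(p, others):
    """True if some other property has both a higher yield and lower CO2."""
    _, y, co2 = p
    return any(oy > y and oco2 < co2
               for _, oy, oco2 in others if (oy, oco2) != (y, co2))

keep = [p for p in portfolio if not dominated(p, portfolio)]
flagged = [p[0] for p in portfolio if dominated(p, portfolio)]
print(flagged)  # ['B', 'D', 'E'] -> candidates for renovation or sale
```

The retained properties form the Pareto front of the two criteria; the study’s full optimization additionally weighs overall portfolio yield against the CO2 reduction target when deciding which properties to drop.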

Results:

In a field where data availability is low, the possibilities for data science exploration are limited. Indeed, the data is aggregated by property and only allows analyses of the portfolio as a whole, which was done in this study. The aim was to optimize the portfolio according to two criteria that seem to be the major decision criteria in the coming decades to respect the CO2 reduction targets while remaining in the context of maximizing financial profits. Using algorithms, it was shown that a reduction in emissions of 48% with a yield gain of 0.06 percentage points could be achieved by reducing the number of properties in the portfolio from 397 to 352.
Next, a proposal was made for the properties removed from the portfolio, based on their financial potential: determining whether renovating or selling is the better option. Renovations could be assessed in detail if accurate information on the elements of each building were available, which is not yet the case but could be achieved through BIM, for instance. This would allow a precise understanding of the energy behavior of the buildings according to their elements considered as a whole, making it possible to adjust the materials and thicknesses to be used. Finally, a theoretical study of the sale of buildings was carried out, determining which of the properties classified as “for sale” could be sold at what price to maximize profits. Despite the theoretical nature of the method, combined with results that may seem optimistic (a 10% gain on the price of 45 properties for sale), these algorithms can easily be put into practice to at least test their effectiveness and potentially raise sale prices by a few percent. To conclude, encouraging results have been found, but grey areas remain: a study of buildings to be bought to replace those sold, a detailed analysis of the marginal effect of each franc invested on the CO2 reduction achieved by a renovation, or an analysis of each object to determine which ones have occupant behavior that should be reviewed so that action can be taken accordingly. The latter two proposals assume that very detailed building data is available, which is complicated by the lack of data centralization and by data protection respectively. The first problem, the lack of centralized data, needs to be fixed as soon as possible in order to push the analyses as far as possible and become a leader in this field.

Summary:
Merck Healthcare has not stopped growing since its creation, and its supply chain becomes increasingly complex over time. Visibility is crucial for its optimization, and long-range planning is one of the key steps. Highly critical strategic supply chain decisions cannot be supported by a poor, manual process, so a more systematic solution must be provided through well-defined processes, roles, and responsibilities.
This project aims to establish a simplified supply chain model usable over an eight-year horizon. It will then be possible to study the allocation between sites and lines, and the capacity consumption, for this time period. The solution needs to allow all functions to collaborate and provide their insights at the level of granularity they are comfortable with. Strategic planning needs the agility to test different assumptions, which cannot be done if every scenario requires serious effort.

Results:
The goal of this project is to develop a tool for long-range planning that is flexible and user-friendly.
The idea is that a simplified supply chain model is created automatically at the lowest granularity by leveraging other systems. Technology makes it possible to manage massive amounts of data in a structured way based on given assumptions, so it is no longer necessary to set up supply chain models at an aggregated level. Since the model is at the lowest granularity, it can be aggregated to see the average allocation by product family and region. This model can then be used to extrapolate each line’s load and to know each product family’s contribution. Afterward, it can be reviewed and modified at an aggregated level according to business insights. The technology allows users to disaggregate and mass-change the model directly: they can test different simulations, and the tool recomputes the long-range plan based on the override entered, without much effort from the user. Disaggregation assumptions are fully transparent, and functionalities such as aggregated planning and drill-down are provided to verify the disaggregation effect. The long-range plan can then be automatically compared with the tactical plan and next year’s budget plan, and gaps can be easily identified.
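The aggregate-override-disaggregate logic described above can be sketched as a simple proportional spread: an override entered at product-family level is pushed back down to individual products in proportion to their baseline volumes. SKU names and figures are illustrative only, not taken from the actual tool.

```python
# Baseline plan at the lowest granularity (units per product).
baseline = {"SKU-1": 400, "SKU-2": 100, "SKU-3": 500}

def aggregate(plan):
    """Family-level view of the plan (all SKUs belong to one family here)."""
    return sum(plan.values())

def disaggregate(plan, family_override):
    """Spread a family-level override across SKUs, proportionally to baseline."""
    total = aggregate(plan)
    return {sku: units * family_override / total for sku, units in plan.items()}

# A planner overrides the family total from 1000 to 1200 units;
# the tool recomputes each SKU transparently, preserving the baseline mix.
revised = disaggregate(baseline, 1200)
print(revised)  # {'SKU-1': 480.0, 'SKU-2': 120.0, 'SKU-3': 600.0}
```

Keeping the disaggregation rule this explicit is what makes the assumptions transparent: a planner can verify by inspection that the override was spread in proportion to the baseline.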

Summary:
This master thesis deals with performance management in a tobacco company working to deliver a smoke-free future. It is based on the need to develop measures to identify and assess current performance in order to reduce Time to Market, as the speed at which companies can introduce products to the market is critical for sustaining competitive advantage and market share. The initiative aims at defining and building a standard, data-leveraged tool to identify key metrics contributing to the speed and quality of development and innovation projects within the company, bringing visibility across stakeholders on measures, insights, and prescriptive actions. The project followed an agile methodology and incorporated a cross-functional team. The number of stakeholders and the prioritization of requirements made this thesis challenging.

Results:
The main result is the development of an MVP for two data products that include critical metrics contributing to the reduction of time to market. These data products help the organization know the current status of the KPIs and the reasons behind it, and provide insight into possible future outcomes if action is taken. In addition, they provide the option to compare KPIs against different references. Finally, the thesis highlights the steps that still need to be implemented, given the current scope, and some recommendations on what could be done to gain more value from these data products.

Summary:

With the different crises the world is facing and the increasing competitiveness between companies, it has become essential to develop tools and methods to face these challenges and to remain reactive, so as to respond effectively to demand and reduce costs. In this report, we therefore develop a method that we have named Read & React (R&R), allowing us to effectively manage the stocks of different products: both in-range products, whose behavior is known and which are part of Cartier’s permanent offer, and newly launched products, which often present uncertainties, especially when establishing the forecast. This method, as its name indicates, first helps us determine whether a product is a candidate and analyze the first sales indicators (Read). We then want to be able to react effectively according to the different scenarios (React). This involves considering the various problems that arise from production planning through to routing the pieces to the markets.

We therefore first applied a calculation method to determine the safety stocks of in-range products at Cartier’s Central Distribution center; this belongs to the React part of the method. We want to take into account the different uncertainties we face (demand and supply delays) and to have precise calculations down to the level of the individual product. The results obtained allowed us to rebalance these stocks according to the needs and to have a uniform, automated formula that also considers sales performance. Compared to what was done before, we were able to include in the calculation the uncertainty of the delays, as well as the error between actual sales and the forecast.
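The exact formula used internally is not given in this summary, but a standard safety-stock calculation that combines demand and lead-time uncertainty, of the kind described above, can be sketched as follows; the service level and figures are invented for the example.

```python
from math import sqrt
from statistics import NormalDist

def safety_stock(avg_demand, sd_demand, avg_lead, sd_lead, service_level):
    """Textbook safety stock covering demand and lead-time uncertainty:
    SS = z * sqrt(L * sigma_d^2 + d^2 * sigma_L^2),
    where z is the normal quantile for the target service level."""
    z = NormalDist().inv_cdf(service_level)
    return z * sqrt(avg_lead * sd_demand**2 + avg_demand**2 * sd_lead**2)

# Example: 20 units/week demand (sd 5), 4-week lead time (sd 1), 95% service.
ss = safety_stock(avg_demand=20, sd_demand=5, avg_lead=4, sd_lead=1,
                  service_level=0.95)
print(round(ss, 1))
```

Dropping the `sd_lead` term recovers the simpler demand-only formula, which illustrates the improvement the summary mentions: including delay uncertainty raises the safety stock exactly where supply is unreliable.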

In the second phase, we developed a roadmap for the launch of new products. First, in the Read part, we set up a method to determine whether a product can be processed in R&R and, if so, to evaluate the first sales signals. Then, in the React part, we evaluated the benefits of stocking Raw Materials & Components or semi-finished products, so as to be able to perform what is called delayed differentiation. The goal is to produce a well-defined, limited quantity at the beginning of the launch and to wait for the first sales signals before deciding whether to produce more. This reduces the losses during launches while maintaining reactivity, since these R&C and semi-finished products shorten lead times and thus allow faster production.
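The Read step of checking first sales signals against the forecast before committing further production can be sketched as a simple decision rule. The thresholds, figures and action labels below are assumptions for illustration, not the thesis’s calibrated values.

```python
def react(forecast_to_date, actual_sales, high=1.2, low=0.8):
    """Compare early sales signals to the forecast and choose a React action.
    Thresholds are hypothetical: >=20% above forecast triggers production,
    <=20% below triggers a hold."""
    ratio = actual_sales / forecast_to_date
    if ratio >= high:
        return "launch additional production from semi-finished stock"
    if ratio <= low:
        return "hold production, reallocate stock across markets"
    return "continue as planned"

# Four weeks after launch: 90 units forecast to date, 120 actually sold.
print(react(forecast_to_date=90, actual_sales=120))
```

Because the additional production starts from semi-finished stock, the lead time between reading the signal and reacting is short, which is exactly what makes the delayed-differentiation setup valuable.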

Results: 

Both parts of the method produced satisfactory results. The new method for calculating safety stock is already implemented and allowed us to rebalance the stocks effectively according to the needs, reducing them by approximately 30%. The part concerning the launch of new products seems more complicated to implement. Even if, theoretically, the method seems to work and reduced losses compared to the traditional approach, the reality on the ground requires several improvements at the supplier level before it can be deployed. These changes should allow better control, efficiency and risk reduction. Moreover, we will need to onboard the different departments and convince them that this method is an opportunity to gain flexibility and rationalization rather than a source of supply chain destabilization.

Summary:
Digital transformation – making business decisions following real-time data analysis of information – has become a standard of Industry 4.0, taking various forms according to the sectors where it is implemented. As part of Merck Group’s operational excellence objectives – productivity, efficiency, integration – digitalization is applied in the Global Regulatory Affairs department of the Healthcare organization to drive sustainable change. This project focused on transforming the planning, tracking, and reporting processes related to the regulatory activities under the team’s responsibility. It integrated the automation of visual representations based on modern IT tools, connectivity between files and functions, as well as human-resource monitoring. While optimizing the planning task, the project also addressed human resource management with the help of appropriate models and quantification techniques. This aspect notably led to balancing the constant pursuit of efficiency and precision against the risks of work alienation.


Results:
Because sustainable change does not happen without acceptance of change, user satisfaction was a clear requirement of the project, with several Voice of Customer analyses integrated into the Lean Six Sigma project management approach. This led to a strong usefulness ranking for the tool and to a significant impact in easing and shortening the planning task for regulatory specialists. Because of the small size of the study group, the validity of the experiment and of the model would need longer post-change studies to be assessed, notably regarding the trade-off between the precision of workflow quantification and the related constraints on the work environment.

Summary:

Recent external events such as the Covid-19 pandemic and the tremendous increase in European energy prices have led to a rapidly growing Smart Building market. Many companies are quickly transitioning to a hybrid work model, driven by the willingness to reduce what is considered unnecessary office space. This transition translates into shrinking office space and large relocation projects, made possible only by adequate space-allocation solutions and the implementation of intelligent hybrid work policies. Additionally, these projects are often a source of awareness of the economies of scale that can be achieved on energy and maintenance by implementing quick-win Smart Building software and hardware.

However, due to the sudden nature of this transition, real-world companies have poor knowledge of the available solutions and a low capacity for evaluating them and making adequate strategic decisions.

Results:

The first investigations made clear that return on investment is highly project-dependent and very hard to measure in the Smart Building industry. They also revealed low awareness of the existing market and a widespread lack of compatibility between different solutions. In this context, an interview-based approach with a large panel of stakeholders was conducted, leading to the creation of an evaluation framework for determining which solutions to implement and for choosing among existing solutions. This framework is composed of a functional diagram of Smart Building systems as well as a detailed guideline for each component of a Smart Building project, based on relevance (“How strongly do users need it? Who is the relevant user?”), implementation/integration (“How easy is it to implement technically? How well does it integrate with current solutions?”) and efficiency (“How well does it work?”).
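As an illustration of how such a framework could be operationalised (the thesis presents it as a qualitative guideline, so this is only a hypothetical extension), each candidate solution can be scored on the three axes and ranked. The solution names, weights and scores below are invented.

```python
# Score each candidate solution from 1 (poor) to 5 (excellent) on the three
# framework axes: relevance, implementation/integration, efficiency.
weights = {"relevance": 0.4, "implementation": 0.3, "efficiency": 0.3}

solutions = {
    "occupancy sensors":  {"relevance": 5, "implementation": 4, "efficiency": 4},
    "smart HVAC control": {"relevance": 4, "implementation": 2, "efficiency": 5},
    "desk-booking app":   {"relevance": 3, "implementation": 5, "efficiency": 3},
}

def score(sol):
    """Weighted average of the three framework axes."""
    return sum(weights[axis] * sol[axis] for axis in weights)

ranking = sorted(solutions, key=lambda name: score(solutions[name]), reverse=True)
print(ranking)
```

The weights encode the framework’s emphasis: relevance is asked first because a technically easy solution nobody needs still scores poorly overall.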

Summary:
This thesis concerns the optimisation of a logistics network using the milk run system. The first goal is to build a mathematical model that captures the optimisation problem, considering different KPIs. The second goal is to develop an approach that makes it possible to solve the problem as a mixed-integer linear problem, and then to run the optimisation of an example network using a solver that builds the optimised milk runs. The third goal is to show the importance of one of the KPIs in the optimisation, and to show that previous optimisations, which did not take it into account in the objective function, led to negative financial impacts for the company.

Results:

The real-life optimisation problem was translated into a mixed-integer problem that was not linear. A two-step strategy was developed to convert it into a mixed-integer linear problem, which we were then able to solve with the FICO XPRESS solver. The algorithm built 35 milk runs to optimise the hypothetical network around Europe, leading to an annual profit of +4MM€. The algorithm was run a second time with an objective function that did not consider one of our KPIs, showing that omitting it, as was done previously, led to a very negative result for the company: -5MM€.
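The summary does not detail the two-step linearisation strategy, but a common way such nonlinearities are removed in routing models is the big-M reformulation of a product between a binary variable y and a bounded continuous variable x: the nonlinear term z = x·y is replaced by four linear constraints. The numerical check below verifies this standard textbook construction on a grid; it is not the thesis’s actual model.

```python
# Big-M linearisation of z = x * y with y binary and 0 <= x <= M.
# The four linear constraints below force z to equal x * y exactly.
M = 100.0

def feasible_z(x, y):
    """All z values on a 0.1-step grid satisfying the linearised constraints."""
    grid = [v / 10 for v in range(0, int(M * 10) + 1)]
    return [z for z in grid
            if z <= M * y               # y = 0 forces z = 0
            and z <= x                  # z never exceeds x
            and z >= x - M * (1 - y)    # y = 1 forces z = x
            and z >= 0]

# With y = 1 the constraints pin z to x; with y = 0 they pin z to 0,
# reproducing z = x * y without any nonlinear term.
assert feasible_z(x=42.0, y=1) == [42.0]
assert feasible_z(x=42.0, y=0) == [0.0]
```

Once every such product is replaced by its linear envelope, the whole model becomes a mixed-integer linear problem that an off-the-shelf MILP solver can handle.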