Refine
Year of publication
Institute
- Fachbereich Wirtschaftswissenschaften (224)
Language
- English (224)
Document Type
- Article (116)
- Conference Proceeding (57)
- Book (32)
- Part of a Book (15)
- Working Paper (3)
- Doctoral Thesis (1)
Keywords
- Telekommunikationsmarkt (4)
- Führung (3)
- Leadership (3)
- rebound-effect (2)
- regulation (2)
- sustainability (2)
- Active learning (1)
- Automation (1)
- Bank-issued Warrants (1)
- Brands (1)
Info-Web-Generation
(2004)
Today’s society is undergoing a paradigm shift driven by the megatrend of sustainability. This undeniably affects all areas of Western life. This paper aims to find out how the luxury industry is dealing with this change and what adjustments are being made by companies. For this purpose, interviews were conducted with managers from the luxury industry, in which they were asked about specific measures taken by their companies as well as trends in the industry. In a subsequent evaluation, the trends in the luxury industry were summarized for the areas of ecological, social, and economic sustainability. It was found that the area of environmental sustainability receives significantly more focus than the other sub-areas. Furthermore, the need for a customer survey to validate the industry-based measures was identified.
We introduce a new way to measure the forecast effort that analysts devote to their earnings forecasts by measuring the analyst's general effort for all covered firms. While the commonly applied effort measure is based on analyst behaviour for one firm, our measure considers analyst behaviour for all covered firms. Our general effort measure captures additional information about analyst effort and thus can identify accurate forecasts. We emphasise the importance of investigating analyst behaviour in a larger context and argue that analysts who generally devote substantial forecast effort are also likely to devote substantial effort to a specific firm, even if this effort might not be captured by a firm-specific measure. Empirical results reveal that analysts who devote higher general forecast effort issue more accurate forecasts. Additional investigations show that analysts' career prospects improve with higher general forecast effort. Our measure improves on existing methods as it has higher explanatory power regarding differences in forecast accuracy than the commonly applied effort measure. Additionally, it can address research questions that cannot be examined with a firm-specific measure. It provides a simple but comprehensive way to identify accurate analysts.
The number of electric vehicles increases steadily while the space for extending the charging infrastructure is limited. In urban areas in particular, where parking spaces in attractive locations are scarce, opportunities to set up new charging stations are very limited. This leads to an overload of some very attractive charging stations and an underutilization of less attractive ones. Against this background, the paper at hand presents the design of an e-vehicle reservation system that aims at distributing the utilization of the charging infrastructure more evenly, particularly in urban areas. By applying a design science approach, the requirements for a reservation-based utilization approach are elicited, and a model for a suitable distribution approach and its instantiation are developed. The artefact is evaluated by simulating the distribution effects based on data of real charging station utilization.
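The core idea of such a reservation-based distribution approach can be sketched in a few lines. The abstract does not specify the paper's actual model, so the greedy least-utilization rule, the station names, and the capacities below are illustrative assumptions, not the authors' design:

```python
# Illustrative sketch of reservation-based load distribution: each incoming
# charging request is booked at the acceptable station with the lowest
# current utilization, instead of the most attractive one.

def reserve(requests, capacity):
    """Greedily book each request at the acceptable station with the
    lowest utilization ratio (reservations / capacity)."""
    load = {station: 0 for station in capacity}
    bookings = []
    for acceptable in requests:  # each request lists its acceptable stations
        station = min(acceptable, key=lambda s: load[s] / capacity[s])
        load[station] += 1
        bookings.append(station)
    return bookings, load

capacity = {"center": 4, "edge": 4}
# All drivers would prefer "center"; the reservation system spreads them out.
requests = [["center", "edge"]] * 6
bookings, load = reserve(requests, capacity)
```

Even with every driver listing the attractive "center" station first, the greedy rule ends up splitting the six requests evenly across both stations.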
Many companies still conduct the worldwide management of people as if neither the external economic environment nor the internal structure of the firm had changed. The costs of cross-cultural failure, for individuals and their companies, are enormous: personal and family costs; financial, professional and emotional costs; costs to one’s career prospects, to one’s self-esteem, to one’s marriage and family. This scenario sufficiently explains the motivation for learning “the art of crossing cultures” (Craig Storti). To this end, this research paper describes an innovative approach to cross-cultural training, following the didactical ideas of Kolb and Fry, the so-called 'experiential learning'.
Domain experts regularly teach novice students how to perform a task. This often requires them to adjust their behavior to the less knowledgeable audience and, hence, to behave in a more didactic manner. Eye movement modeling examples (EMMEs) are a contemporary educational tool for displaying experts’ (natural or didactic) problem-solving behavior as well as their eye movements to learners. While research on expert-novice communication mainly focused on experts’ changes in explicit, verbal communication behavior, it is as yet unclear whether and how exactly experts adjust their nonverbal behavior. This study first investigated whether and how experts change their eye movements and mouse clicks (that are displayed in EMMEs) when they perform a task naturally versus teach a task didactically. Programming experts and novices initially debugged short computer codes in a natural manner. We first characterized experts’ natural problem-solving behavior by contrasting it with that of novices. Then, we explored the changes in experts’ behavior when being subsequently instructed to model their task solution didactically. Experts became more similar to novices on measures associated with experts’ automatized processes (i.e., shorter fixation durations, fewer transitions between code and output per click on the run button when behaving didactically). This adaptation might make it easier for novices to follow or imitate the expert behavior. In contrast, experts became less similar to novices for measures associated with more strategic behavior (i.e., code reading linearity, clicks on run button) when behaving didactically.
The FAYMONVILLE case study describes how the family-owned company Faymonville from eastern Belgium has succeeded in becoming one of the leading manufacturers in its sector. The targeted identification of new markets, the focus on relevant customer needs, and a consistent product policy with a coordinated manufacturing concept lay the foundations for success. In this case study, students can learn how a company can successfully resolve the fundamental contradiction between economical and customized production.
Extracting workflow nets from textual descriptions can be used to simplify guidelines or formalize textual descriptions of formal processes like business processes and algorithms. The task of manually extracting processes, however, requires domain expertise and effort. While automatic process model extraction is desirable, annotating texts with formalized process models is expensive. Therefore, there are only a few machine-learning-based extraction approaches. Rule-based approaches, in turn, require domain specificity to work well and can rarely distinguish relevant and irrelevant information in textual descriptions. In this paper, we present GUIDO, a hybrid approach to the process model extraction task that first, classifies sentences regarding their relevance to the process model, using a BERT-based sentence classifier, and second, extracts a process model from the sentences classified as relevant, using dependency parsing. The presented approach achieves significantly better results than a pure rule-based approach. GUIDO achieves an average behavioral similarity score of 0.93. Still, in comparison to purely machine-learning-based approaches, the annotation costs stay low.
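The two-stage structure described above can be illustrated with a toy pipeline. GUIDO itself uses a BERT sentence classifier and dependency parsing; the keyword heuristic, stopword list, and verb set below are simplistic stand-ins chosen only to make the relevance-filtering-then-extraction idea concrete:

```python
# Toy two-stage pipeline in the spirit of GUIDO: stage 1 filters sentences
# for process relevance (GUIDO uses a BERT classifier; a keyword heuristic
# stands in here), stage 2 extracts crude activities (GUIDO uses dependency
# parsing). All word lists are illustrative assumptions, not GUIDO's rules.
import re

ACTION_VERBS = {"send", "check", "approve", "reject", "create", "review"}
STOPWORDS = {"the", "a", "an"}

def tokenize(sentence):
    # lowercase words with a crude plural / 3rd-person "s" stripped
    return [w.rstrip("s") for w in re.findall(r"[a-z]+", sentence.lower())]

def is_relevant(sentence):
    """Stage 1: keep only sentences that mention a process action."""
    return any(w in ACTION_VERBS for w in tokenize(sentence))

def extract_activities(sentence):
    """Stage 2: pair each action verb with the next content word."""
    words = tokenize(sentence)
    activities = []
    for i, w in enumerate(words):
        if w in ACTION_VERBS:
            objects = [x for x in words[i + 1:] if x not in STOPWORDS]
            if objects:
                activities.append(f"{w} {objects[0]}")
    return activities

text = [
    "The clerk checks the invoice.",
    "Our office was founded in 1997.",
    "Then the manager approves the payment.",
]
relevant = [s for s in text if is_relevant(s)]
activities = [a for s in relevant for a in extract_activities(s)]
```

The second sentence carries no process information and is dropped in stage 1, so stage 2 only ever parses relevant text; this separation is what keeps the rule-based extraction step from being distracted by irrelevant descriptions.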
Goal Driven Business Modelling - Supporting Decision Making within Information System Development
(1995)
With a steady increase of regulatory requirements for business processes, automation support of compliance management is a field garnering increasing attention in Information Systems research. Several approaches have been developed to support compliance checking of process models. One major challenge for such approaches is their ability to handle different modeling techniques and compliance rules in order to enable widespread adoption and application. Applying a structured literature search strategy, we reflect and discuss compliance-checking approaches in order to provide an insight into their generalizability and evaluation. The results imply that current approaches mainly focus on special modeling techniques and/or a restricted set of types of compliance rules. Most approaches abstain from real-world evaluation which raises the question of their practical applicability. Referring to the search results, we propose a roadmap for further research in model-based business process compliance checking.
Gamification and gamified information systems (GIS) apply video game elements to encourage work on boring, everyday tasks. Meanwhile, several research works provide evidence that gamification increases the efficiency and effectiveness of such tasks. The paper at hand investigates the health care sector, which is challenged by cost pressure and suffers from process inefficiencies. We hypothesize that GIS may improve the efficiency and quality of care processes. By applying an interview-based content analysis, the paper evaluates gamification elements in an assisted living environment and provides three research contributions. First, insights into relevant GIS affordances and application examples for assisted living facilities are given. Second, assisted living experts evaluate GIS design guidelines. Both the relevant affordances and design principles comprise a basis for the development of a GIS for social workers in assisted living facilities. Third, potential adoption barriers and design guidelines for GIS in assisted living are presented.
A Gamified Information System (GIS) implements game concepts and elements, such as affordances and game design principles, to motivate people. Based on the idea of developing a GIS to increase the motivation of software developers to perform software quality tasks, the research work at hand aims at investigating relevant requirements from that target group. Therefore, 14 interviews with software development experts are conducted and analyzed. According to the results, software developers prefer the affordances of points and narrative storytelling in a multiplayer, round-based setting. Furthermore, six design principles for the development of a GIS are derived.
Researching the field of business intelligence and analytics (BI & A) has a long tradition within information systems research. In each decade, the rapid development of technologies opened up new room for investigation. Since the early 1950s, the collection and analysis of structured data were the focus of interest, followed by unstructured data since the early 1990s. The third wave of BI & A comprises unstructured and sensor data of mobile devices. The article at hand aims at drawing a comprehensive overview of the status quo of relevant BI & A research in the current decade, focusing on the third wave of BI & A. The paper’s contribution is fourfold. First, a systematically developed taxonomy for BI & A 3.0 research, containing seven dimensions and 40 characteristics, is presented. Second, the results of a structured literature review containing 75 full research papers are analyzed by applying the developed taxonomy. The analysis provides an overview of the status quo of BI & A 3.0. Third, the results foster discussions on the predicted and observed developments in BI & A research of the past decade. Fourth, research gaps of the third wave of BI & A research are disclosed and condensed into a research agenda.
Manufacturing companies are forced to operate in an increasingly volatile and unpredictable environment. The number of events that can have a potentially critical impact on a production system‘s economic performance has significantly increased. This forces companies to invest considerably more in flexible and robust production systems capable of withstanding a certain amount of change, yet they are unable to quantify the benefits in advance. A satisfactory quantification and assessment of these qualities – flexibility and robustness – has not been realized yet. This paper discusses the commonalities between flexibility and robustness and offers a new approach to connecting changes in the environment with the elements of a production system, thus quantifying its flexibility and robustness.
Purpose
In the determination of the measurement uncertainty, the GUM procedure requires the building of a measurement model that establishes a functional relationship between the measurand and all influencing quantities. Since the effort of modelling as well as quantifying the measurement uncertainties depend on the number of influencing quantities considered, the aim of this study is to determine relevant influencing quantities and to remove irrelevant ones from the dataset.
Design/methodology/approach
In this work, it was investigated whether the effort of modelling for the determination of measurement uncertainty can be reduced by the use of feature selection (FS) methods. For this purpose, 9 different FS methods were tested on 16 artificial test datasets, whose properties (number of data points, number of features, complexity, features with low influence and redundant features) were varied via a design of experiments.
Findings
Based on a success metric, the stability, universality and complexity of the method, two FS methods could be identified that reliably identify relevant and irrelevant influencing quantities for a measurement model.
Originality/value
For the first time, FS methods were applied to datasets with properties of classical measurement processes. The simulation-based results serve as a basis for further research in the field of FS for measurement models. The identified algorithms will be applied to real measurement processes in the future.
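The kind of filter-type feature-selection method benchmarked in this study can be illustrated with a minimal sketch: rank candidate influencing quantities by their absolute Pearson correlation with the measurand and drop the weakest. The synthetic dataset below is an assumption for demonstration only, not one of the paper's 16 test datasets, and the correlation filter is a generic stand-in rather than one of the nine methods actually compared:

```python
# Minimal filter-type feature selection: score each candidate influencing
# quantity by |Pearson correlation| with the measurand, keep the top ones.
import random

def pearson(x, y):
    """Sample Pearson correlation coefficient of two equal-length lists."""
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    cov = sum((a - mx) * (b - my) for a, b in zip(x, y))
    sx = sum((a - mx) ** 2 for a in x) ** 0.5
    sy = sum((b - my) ** 2 for b in y) ** 0.5
    return cov / (sx * sy)

def select_features(features, target, keep):
    """Return the names of the `keep` most correlated features."""
    scores = {name: abs(pearson(col, target)) for name, col in features.items()}
    return sorted(scores, key=scores.get, reverse=True)[:keep]

rng = random.Random(0)
noise = [rng.gauss(0, 1) for _ in range(200)]   # irrelevant quantity
temp = [rng.gauss(20, 2) for _ in range(200)]   # relevant quantity
measurand = [t * 0.5 + rng.gauss(0, 0.1) for t in temp]
selected = select_features({"temp": temp, "noise": noise}, measurand, keep=1)
```

On this synthetic data the filter keeps the temperature-like quantity and discards the pure noise channel, which is exactly the behavior a measurement-model builder wants before the modelling effort starts.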
Explorer CEOs: The effect of CEO career variety on large firms’ relative exploration orientation
(2018)
Prior studies demonstrate that firms need to make smart trade-off decisions between exploration and exploitation activities in order to increase performance. Chief executive officers (CEOs) are principal decision makers of a firm’s strategic posture. In this study, we theorize and empirically examine how relative exploration orientation of large publicly listed firms varies based on the career variety of their CEOs – that is, how diverse the professional experiences of executives were prior to them becoming CEOs. We further argue that the heterogeneity and structure of the top management team moderates the impact of CEO career variety on firms’ relative exploration orientation. Based on multisource secondary data for 318 S&P 500 firms from 2005 to 2015, we find that CEO career variety is positively associated with relative exploration orientation.
Interestingly, CEOs with high career variety appear to be less effective in pursuing exploration when they work with highly heterogeneous and structurally interdependent top management teams.
Enterprise SOA Roadmap
(2008)
Working paper distributed at the 2nd Annual Next Generation Telecommunications Conference, 13th–14th October 2009, Brussels (14 pages). Governments all over Europe are in the process of adopting new broadband strategies. The objective is to create modern telecommunications networks based on powerful broadband infrastructures. In doing so, they aim for innovative and investment-friendly concepts. For instance, in a recently published consultation paper on the subject, the German regulator BNetzA declared that it will take “greater account of … reducing risks, securing the investment and innovation power, providing planning certainty and transparency – in order to support and advance broadband rollout in Germany”. It further states that when regulating wholesale rates it has to be ensured that “… adequate incentives for network rollout are provided on the one hand, while sustainable and fair competition is ensured on the other”. An EC draft recommendation on regulated network access is also about to set new standards for the regulation of next generation access networks. According to the recommendation, the prices of new assets shall be based on costs plus a project-specific risk premium to be included in the cost of capital for the investment risk incurred by the operator. This approach has been criticised from various sides. In particular, it has been questioned whether such an approach is adequate to meet the objectives of encouraging both competition and investment in next generation access networks. Against this background, the concept of “long term risk sharing contracts” has recently been proposed as an approach which not only incorporates the various additional risks involved in the deployment of NGA infrastructure, but has several other advantages. This paper will demonstrate that the concept allows competition to evolve at both the retail and wholesale level on fair, objective, non-discriminatory and transparent terms and conditions.
Moreover, it ensures the highest possible investment incentive in line with socially desirable outcomes. The paper is organised as follows: The next section will briefly outline the importance of encouraging competition and investment in an NGA environment. The third section will specify the design of long term risk sharing contracts in view of achieving these objectives. The fourth section will examine potential problems associated with the concept and elaborate ways of dealing with them. The last section will look at arguments against long term risk sharing contracts. It will be shown that these arguments are not strong enough to build a case against introducing such contracts.
Does stiffer electoral competition reduce political shirking? For a micro-analysis of this question, I construct a new data set spanning the years 2005 to 2012 covering biographical and political information about German Members of Parliament (MPs), including their attendance rates in voting sessions. For the parliament elected in 2009, I show that indeed opposition party MPs who expect to face a close race in their district show significantly and relevantly lower absence rates in parliament beforehand. MPs of governing parties seem not to react significantly to electoral competition. These results are confirmed by an analysis of the parliament elected in 2005, by several robustness checks, and also by employing an instrumental variable strategy exploiting convenient peculiarities of the German electoral system. The study also shows how MPs elected via party lists react to different levels of electoral competition.
Divided government is often thought of as causing legislative deadlock. I investigate the link between divided government and economic reforms using a novel data set on welfare reforms in US states between 1978 and 2010. Panel data regressions show that, under divided government, a US state is around 25% more likely to adopt a welfare reform than under unified government. Several robustness checks confirm this counter-intuitive finding. Case study evidence suggests an explanation based on policy competition between governor, senate, and house.
Die Garantie im Kaufrecht
(1995)
A key feature of future broadband markets will be the diversity of access technologies, meaning that numerous technologies will be exploited for broadband communication. Various factors will affect the success of these future broadband markets, regulatory policy being one amongst others. So far, no coherent regulatory approach exists for broadband markets. First results of policies to date suggest that less sector-specific regulation is likely to occur. Instead, regulators must ensure that access to the networks and services of potentially dominant providers in a relevant broadband market will satisfy requirements for openness and non-discrimination. In this environment, the future challenge of regulating broadband markets will be to set the right incentives for investment in new infrastructures. This paper examines whether there is a need for the regulation of future broadband access markets and, if so, what the appropriate regulatory tool is. The focus is on the analysis of European broadband markets and the regulatory approaches applied. The first section provides a description of the characteristics of future broadband markets. The second section discusses possible bottlenecks in broadband markets and their regulatory implications. The third section will examine regulatory issues concerning access to broadband networks in more detail, by comparing the regulatory approaches of European countries and the results in terms of broadband penetration. The final section will give key recommendations for a regulatory strategy on broadband access markets.
Determinants of earnings forecast error, earnings forecast revision and earnings forecast accuracy
(2012)
Earnings forecasts are ubiquitous in today’s financial markets. They are essential indicators of future firm performance and a starting point for firm valuation. Extremely inaccurate and overoptimistic forecasts during the most recent financial crisis have raised serious doubts regarding the reliability of such forecasts. This thesis therefore investigates new determinants of forecast errors and accuracy. In addition, new determinants of forecast revisions are examined. More specifically, the thesis answers the following questions: 1) How do analyst incentives lead to forecast errors? 2) How do changes in analyst incentives lead to forecast revisions? and 3) What factors drive differences in forecast accuracy?
Cure or blessing? The effect of (non-financial) signals on sustainable venture's funding success
(2022)
We analyze the trading behavior of individual investors in option-like securities, namely bank-issued warrants, and thus expand the growing literature on investor behavior to a new kind of security. A unique data set from a large German discount broker gives us the opportunity to analyze the trading behavior of 1,454 investors, making 89,958 transactions in 6,724 warrants on 397 underlyings. In different logit regressions, we make use of the facts that investors can speculate on rising and falling prices of the underlying with call and put warrants and that we also have information about the stock portfolios of the investors. We report several facts about the trading behavior of individual investors in warrants that are consistent with the literature on the behavior of individual investors in the stock market. The warrant investors buy calls and sell puts if the price of the underlying has decreased over the past trading days, and they sell calls and buy puts if the price of the underlying has increased. That means the investors follow negative feedback trading strategies in all four trading categories observed. In addition, we find strong evidence for the disposition effect for call as well as put warrants, which is reversed in December. The trading behavior is also influenced when the underlying reaches exceptional prices, e.g. highs, lows or the strike price. We show that hedging, as one natural candidate for buying puts, does not play an important role in the market for bank-issued warrants.
Given the strong increase in regulatory requirements for business processes, the management of business process compliance is becoming an increasingly regarded field in IS research. Several methods have been developed to support compliance checking of conceptual models. However, their focus on distinct modeling languages and mostly linear (i.e., predecessor-successor related) compliance rules may hinder widespread adoption and application in practice. Furthermore, hardly any of them has been evaluated in a real-world setting. We address this issue by applying a generic pattern matching approach for conceptual models to business process compliance checking in the financial sector. It consists of a model query language, a search algorithm and a corresponding modelling tool prototype. It is applicable (1) to all graph-based conceptual modeling languages and (2) to different kinds of compliance rules. Furthermore, based on an applicability check, we (3) evaluate the approach in a financial industry project setting with regard to its relevance for decision support of audit and compliance management tasks.
Books Reviewed - European Democratization since 1800 edited by J. Garrard, V. Tolz and R. White
(2000)
In the past decade, many IS researchers focused on researching the phenomenon of Big Data. At the same time, the relevance of data protection is receiving more attention than ever before. In particular, since the enactment of the European General Data Protection Regulation in May 2018, Information Systems research should provide answers for protecting personal data. The article at hand presents a structuring framework for Big Data research outcomes and the consideration of data protection. IS researchers might use the framework to structure Big Data literature and to identify research gaps that should be addressed in the future.
The use of industrial robots allows the precise manipulation of all components necessary for setting up a large-scale particle image velocimetry (PIV) system. The known internal calibration matrix of the cameras in combination with the actual pose of the industrial robots and the calculated transform from the fiducial markers to camera coordinates allow the precise positioning of the individual PIV components according to the measurement demands. In addition, the complete calibration procedure for generating the external camera matrix and the mapping functions for e.g. dewarping the stereo images can be automatically determined without further user interaction and thus the degree of automation can be extended to nearly 100%. This increased degree of automation expands the applications range of PIV systems, in particular for measurement tasks with severe time constraints.
Lifting propellers are of increasing interest for Advanced Air Mobility. All propellers and rotors are initially twisted beams, showing significant extension–twist coupling and centrifugal twisting. Torsional deformations severely impact aerodynamic performance. This paper presents a novel approach to assess different reasons for torsional deformations. A reduced-order model runs large parameter sweeps with algebraic formulations and numerical solution procedures. Generic beams represent three different propeller types for General Aviation, Commercial Aviation, and Advanced Air Mobility. Simulations include solid and hollow cross-sections made of aluminum, steel, and carbon fiber-reinforced polymer. The investigation shows that centrifugal twisting moments depend on both the elastic and initial twist. The determination of the centrifugal twisting moment solely based on the initial twist suffers from errors exceeding 5% in some cases. The nonlinear parts of the torsional rigidity do not significantly impact the overall torsional rigidity for the investigated propeller types. The extension–twist coupling related to the initial and elastic twist in combination with tension forces significantly impacts the net cross-sectional torsional loads. While the increase in torsional stiffness due to initial twist contributes to the overall stiffness for General and Commercial Aviation propellers, its contribution to the lift propeller’s stiffness is limited. The paper closes with the presentation of approximations for each effect identified as significant. Numerical evaluations are necessary to determine each effect for inhomogeneous cross-sections made of anisotropic material.
Process mining receives more and more attention even outside large enterprises and can be a major benefit for small and medium-sized enterprises (SMEs) seeking competitive advantages. Applying process mining is challenging, particularly for SMEs, because they have fewer resources and less process maturity. So far, IS researchers have analyzed process mining challenges with a focus on larger companies. This paper investigates the application of process mining by means of a case study and sheds light on the particular challenges of an IT SME. The results reveal 13 SME process mining challenges and seven guidelines to address them. In this way, the paper contributes to the understanding of process mining application in SMEs and shows similarities and differences to larger companies.
Software development projects often fail because of insufficient code quality. It is now well documented that the task of testing software, for example, is perceived as uninteresting and rather boring, leading to poor software quality and major challenges for software development companies. One promising approach to increasing the motivation for considering software quality is the use of gamification. Initial research works have already investigated the effects of gamification on software developers and come to promising results. Nevertheless, a lack of results from field experiments exists, which motivates the chapter at hand. By conducting a gamification experiment with five student software projects and by interviewing the project members, the chapter provides insights into the changing programming behavior of information systems students when confronted with a leaderboard. The results reveal a motivational effect as well as a reduction of code smells.
The successful implementation and continuous development of sustainable corporate-level solutions is a challenge. These are endeavours in which social, environmental, and financial aspects must be weighed against each other. They can prove difficult to handle and, in some cases, almost unrealistic. Concepts such as green controlling, IT, and manufacturing look promising and are constantly evolving. This paper aims to achieve a better understanding of the field of corporate sustainability (CS). It evaluates the hypothesis that corporate sustainability thrives by being efficient, increasing performance, and raising the value enterprises derive from the resources they use. On the surface, this could seem to contradict the common understanding that CS encourages reducing the heavy reliance on natural resources and the overall environmental impact and, above all, protecting those resources. This paper therefore places emphasis on providing useful insight into how this seemingly contradictory notion of CS came about. The first part of the paper summarizes various definitions, organizational theories, and measures used for CS and its derivatives like green controlling, IT, and manufacturing. Second, a case study is given that combines the aforementioned sustainability models. In addition to evaluating the hypothesis, the overarching objective of this paper is to demonstrate the use of green controlling, IT, and manufacturing in the corporate sector. Furthermore, this paper outlines the current challenges and possible directions for CS in the future.
Supervised machine learning and deep learning require a large amount of labeled data, which data scientists obtain in a manual and time-consuming annotation process. To mitigate this challenge, Active Learning (AL) proposes promising data points for annotators to annotate next, instead of a sequential or random sample. This method is supposed to save annotation effort while maintaining model performance.
However, practitioners face many AL strategies for different tasks and need an empirical basis to choose between them. Surveys categorize AL strategies into taxonomies without performance indications. Presentations of novel AL strategies compare the performance to a small subset of strategies. Our contribution addresses the empirical basis by introducing a reproducible active learning evaluation (ALE) framework for the comparative evaluation of AL strategies in NLP.
The framework allows AL strategies to be implemented with low effort and compared fairly in a data-driven way by defining and tracking experiment parameters (e.g., initial dataset size, number of data points per query step, and the budget). ALE helps practitioners make more informed decisions, while researchers can focus on developing new, effective AL strategies and deriving best practices for specific use cases. With such best practices, practitioners can lower their annotation costs. We present a case study illustrating how to use the framework.
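The core idea of an AL query step, as described above, can be sketched as follows. This is a minimal illustration of uncertainty sampling, not the ALE framework's actual API; all names (select_batch, toy_proba, etc.) are hypothetical.

```python
# Minimal sketch of one active-learning query step using uncertainty
# sampling: instead of a random sample, the model's least confident
# pool items are proposed for annotation. Illustrative only; not the
# ALE framework's API.

def select_batch(pool, predict_proba, batch_size):
    """Return the batch_size pool items the model is least confident about.

    pool          -- list of unlabeled examples
    predict_proba -- function mapping an example to its class-probability list
    batch_size    -- number of data points per query step (an experiment
                     parameter tracked by a framework like ALE)
    """
    # Confidence = probability of the most likely class; lower means
    # more uncertain, hence more informative to annotate.
    scored = sorted(pool, key=lambda x: max(predict_proba(x)))
    return scored[:batch_size]

# Toy "model" that is unsure about short strings.
def toy_proba(text):
    p = min(0.5 + len(text) / 20, 0.99)
    return [p, 1 - p]

pool = ["ok", "a much longer unlabeled sentence", "hi", "medium text"]
print(select_batch(pool, toy_proba, 2))  # → ['ok', 'hi']
```

In a full loop, the selected batch would be labeled by annotators, added to the training set, and the model retrained before the next query step, until the annotation budget is exhausted.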
AI-based systems are nearing ubiquity not only in everyday low-stakes activities but also in medical procedures. To protect patients and physicians alike, explainability requirements have been proposed for the operation of AI-based decision support systems (AI-DSS), which adds hurdles to the productive use of AI in clinical contexts. This raises two questions: Who decides on these requirements? And how should access to AI-DSS be provided to communities that reject these standards, particularly when such communities are expert-scarce? This chapter investigates a dilemma that emerges from the implementation of global AI governance: while rejecting global AI governance limits the ability to help communities in need, global AI governance risks undermining health-insecure communities and subjecting them to the force of the neo-colonial world order. To this end, the chapter first surveys the current landscape of AI governance and introduces the approach of relational egalitarianism as key to (global health) justice. To discuss the two horns of the dilemma, the core power imbalances faced by health-insecure collectives (HICs) are examined. The chapter argues that only the strong demands of a dual strategy towards health-secure collectives can both remedy the immediate needs of HICs and enable them to become healthcare-independent.
Integrated voice assistants (IVA) are receiving more and more attention and are widespread in entertainment use cases such as listening to the radio or web searches. At the same time, the health care sector suffers from process inefficiencies and staff shortages, while the use of IVA has the potential to improve care processes and patient satisfaction. Applying a design science approach based on a qualitative study, we identify IVA requirements, barriers, and design guidelines for the health care sector. The results reveal three important IVA functions: the ability to set appointments with care service staff, the documentation of health history, and communication with service staff. Integration, system stability, and volume control are the most important non-functional requirements. Based on the interview results and project experience, six design and implementation guidelines are derived.
Adaptive logistics : information management for planning and control of small series assembly
(2007)
The following article deals with the basic principles of intercultural management and possible improvements in terms of cultural, ethnic, and gender diversification. The results are applied, by way of example, to a bank located in Germany. The aim of this paper is to find out to what extent intercultural management could improve the productivity of Relatos-Bank in dealing with foreign employees or employees with a different cultural background. To achieve this goal, the authors conduct a literature review; the main sources of information are books, journal articles, and internet sources. It becomes clear that, in particular, the differing perceptions of different generations harbour a potential for conflict, which can be counteracted by applying the scientific models presented. Equalizing the salaries of female and male employees, as well as equalizing rights and the distribution of power, could also be the key to becoming an open-minded, dynamic, and fair organization that is prepared for the rapidly changing environment in which it operates.