Research lines

Key Performance Indicators (KPIs) and Requirements Traceability

One of the key aspects of a successful business intelligence implementation is the modeling, monitoring, and traceability of the business strategy and its associated key indicators (Key Performance Indicators, or KPIs). Generally, little attention is paid to the relationship between KPIs and business strategy, or to how they influence each other. Fundamental information is thus taken for granted, even though it could explain why the company's performance is lower than expected. In our approach this information is modeled explicitly, allowing the user to see the relationships between objectives and KPIs and supporting complete traceability for both indicators and requirements. We can therefore quickly and clearly identify which elements are affected by each indicator and which elements, from the tables in the warehouse up to the visualization tools, would need to change if the company's strategy changes.
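A minimal sketch of this traceability idea, with invented KPI, objective, and warehouse element names: each KPI is linked to the strategic objective it measures and to the elements that implement it, so the impact of changing any element can be queried directly.

```python
# Hypothetical traceability model: KPI -> objective it measures, and the
# warehouse/visualization elements that implement it. All names are illustrative.
traceability = {
    "avg_delivery_time": {
        "objective": "Improve customer satisfaction",
        "implemented_by": ["fact_shipments", "dim_date", "logistics_dashboard"],
    },
    "monthly_revenue": {
        "objective": "Increase sales by 10%",
        "implemented_by": ["fact_sales", "dim_product", "sales_dashboard"],
    },
}

def impact_of_change(element):
    """Return the KPIs (and their objectives) affected if `element` changes."""
    return {
        kpi: links["objective"]
        for kpi, links in traceability.items()
        if element in links["implemented_by"]
    }

print(impact_of_change("fact_sales"))
# {'monthly_revenue': 'Increase sales by 10%'}
```

The same mapping can be traversed in the other direction, from a changed strategic objective down to the warehouse tables and dashboards that would need revision.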

Open Business Intelligence (Open BI) and Open-data

There is currently a growing interest in transparency in the public sector, and the use of technology and new standards plays a key role in it. The BI area extends its data analysis possibilities for decision support systems beyond the private, local information of each company. One of the current problems is that the majority of "open data" is published in heterogeneous and unstructured formats (PDF, HTML, CSV, etc.). A research line in our group focuses precisely on transforming this data into standard, structured formats such as RDF so that it can be processed more easily by computer programs. In this line, our goal is to offer SMEs dashboards and SPARQL access points to the "open data" that interests them.
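As an illustration of this transformation step, the sketch below lifts a small CSV fragment into RDF triples serialized as N-Triples. The URIs and predicate names are invented for the example; a real pipeline would reuse standard vocabularies and a library such as rdflib.

```python
# Toy "open data" lifting: CSV rows -> N-Triples. URIs are hypothetical.
import csv
import io

CSV_DATA = """id,name,budget
1,Parks,120000
2,Libraries,95000
"""

BASE = "http://example.org/opendata/"

def csv_to_ntriples(text):
    triples = []
    for row in csv.DictReader(io.StringIO(text)):
        subject = f"<{BASE}dept/{row['id']}>"
        # One triple per non-key column, using the column name as predicate.
        triples.append(f'{subject} <{BASE}name> "{row["name"]}" .')
        triples.append(f'{subject} <{BASE}budget> "{row["budget"]}" .')
    return "\n".join(triples)

print(csv_to_ntriples(CSV_DATA))
```

Once the data is in RDF, it can be loaded into any triple store and exposed through a SPARQL endpoint for the dashboards mentioned above.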

Methodology of data warehouse development

Traditional techniques such as databases or office applications are insufficient to support decision-making in the business environment, since they do not provide mechanisms for an exhaustive analysis of the information. A business intelligence system requires several elements: an integrated data repository, commonly called a data warehouse; cleaning and data loading procedures; and data analysis tools. One should keep in mind that the design process of a business intelligence system must be hybrid: unlike traditional software systems, its design depends not only on the requirements of the users, which must also be contrasted with the available data. This makes the design and implementation of these systems extremely complex and expensive, so they have traditionally been affordable only for large companies. In recent years the Lucentia research group has worked on a less expensive development process that allows a fully functional business intelligence system to be designed for any SME.
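The hybrid idea of contrasting user requirements with the available data can be shown with a toy example (all table and requirement names are invented): each analysis requirement is checked against the columns the sources actually provide, so only implementable requirements drive the warehouse design.

```python
# Hybrid design sketch: requirements vs. available source data (invented names).
available_columns = {"sales.amount", "sales.date", "customer.region"}

requirements = {
    "revenue_by_region": {"sales.amount", "customer.region"},
    "margin_by_product": {"sales.amount", "product.cost"},  # product.cost missing
}

def contrast(requirements, available):
    """Split requirements into feasible ones and those with missing data."""
    feasible, infeasible = {}, {}
    for name, needed in requirements.items():
        missing = needed - available
        (infeasible if missing else feasible)[name] = missing
    return feasible, infeasible

feasible, infeasible = contrast(requirements, available_columns)
print(sorted(feasible))   # feasible requirements
print(infeasible)         # requirements blocked by missing source columns
```

Infeasible requirements are not simply discarded: they point at source data that would have to be captured before the corresponding analysis can be offered.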

Data Visualization, Open-Data, RDF, and Unstructured Information Processing

A business intelligence application must include dynamic visualization, reachable from anywhere and adapted to the strategic objectives of the company, so that analysts and executives can take full advantage of it. Currently, the trend of offering public data (open data) in open, standard formats such as RDF enables analyses that go beyond the particulars of a single company. Finally, the use of unstructured information from social networks and the media for decision support is also of growing interest in the area.

Security in Business Intelligence applications

This line complements the development of data warehouses and websites described above. The group has developed a method for incorporating security from the early stages of development, since the historical data contained in the data warehouse, later accessed through various applications (user interfaces, web applications, etc.), are often highly critical and require methods that build in security from the beginning. Nowadays, data security is critical, since it is regulated by laws in different areas whose non-compliance can lead to severe penalties for companies and public administrations. Therefore, the Lucentia group has been working on specifying security aspects from the early stages of development so that they are automatically generated in the final implementation of the data warehouse. In this research, Lucentia collaborates with the ALARCOS group at UCLM, led by Professor Mario Piattini Velthuis. This collaboration has resulted in a considerable number of scientific publications as well as two doctoral theses co-directed by professors Juan C. Trujillo and Eduardo Fernández-Medina.
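A minimal sketch of the idea of generating security in the final implementation from model annotations, with hypothetical tables, roles, and security levels: each element of the conceptual model carries a level, and access statements are derived for every role whose clearance is sufficient.

```python
# Model-driven security sketch (all names invented): conceptual-model security
# annotations are transformed into GRANT statements for the implementation.
model = [
    {"table": "fact_admissions", "level": "confidential"},
    {"table": "dim_date",        "level": "public"},
]

ROLE_CLEARANCE = {"analyst": "public", "doctor": "confidential"}
LEVEL_RANK = {"public": 0, "confidential": 1}

def generate_grants(model):
    stmts = []
    for element in model:
        for role, clearance in ROLE_CLEARANCE.items():
            # A role may read an element only if its clearance dominates the level.
            if LEVEL_RANK[clearance] >= LEVEL_RANK[element["level"]]:
                stmts.append(f"GRANT SELECT ON {element['table']} TO {role};")
    return stmts

for s in generate_grants(model):
    print(s)
```

The real method works on richer UML-based security models; this sketch only shows how declarative annotations can be turned into enforceable code automatically.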

Application Quality

Currently, the impact of software applications on the functioning of organizations requires that applications not only perform their tasks (functional requirements) but also meet certain quality criteria: efficiency, reliability, productivity, fault tolerance, maintainability, etc. These quality parameters can be assured from the early stages of development, thereby decreasing the costs associated with late corrections. Our research in this direction is the identification and validation (theoretical and empirical) of measures at different levels of abstraction in the software development process, helping to ensure the overall quality of the resulting application. Again, in this research Lucentia collaborates with the ALARCOS group at UCLM, led by Professor Mario Piattini Velthuis. This collaboration has also resulted in a considerable number of scientific publications.
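A toy example of the kind of measure studied in this line: a simple size measure (number of attributes per class in a design model) checked against a quality threshold. Both the model and the threshold are invented for illustration; validated measures and thresholds come from the theoretical and empirical studies mentioned above.

```python
# Hypothetical design model: class name -> attributes.
design_model = {
    "Customer": ["id", "name", "email"],
    "Order": ["id", "date", "total", "status", "discount", "tax", "notes",
              "carrier", "tracking", "priority", "channel"],
}

THRESHOLD = 10  # illustrative: classes above this size are maintainability risks

def flag_large_classes(model, threshold=THRESHOLD):
    """Return the classes whose size measure exceeds the threshold."""
    return [cls for cls, attrs in model.items() if len(attrs) > threshold]

print(flag_large_classes(design_model))
```

Applying such measures at the model level, before any code exists, is what allows quality problems to be corrected early and cheaply.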

Data mining

In times of crisis, data mining can add value for companies that need all the available information to deal with the current economic situation. Companies generate and store millions of data items every day, and these can provide useful information when data mining techniques are applied, for example to find a customer segmentation for targeted marketing. Through these techniques, hidden relationships and patterns among the data can be found, for instance to discover where the company is losing money. The Lucentia research group has developed several tools to model these processes in a way that is easy and close to the final user, implementing an application that automates the generation of code for different data mining frameworks, with the very positive advantage that the tool adapts to any company platform. In addition, it is noteworthy that data mining also applies to companies with an Internet presence: Web mining involves applying data mining techniques to documents and Web services in order to obtain valuable information about customer behaviour. For instance, companies selling online that study shopping behaviour through data mining can recommend, in real time, other products with a high probability of being bought together, and thus enhance sales.
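The customer segmentation mentioned above can be illustrated with a minimal k-means clustering over invented data (two features: annual spend and number of visits). This is only a sketch; a real project would use a library such as scikit-learn and proper initialization.

```python
# Toy k-means for customer segmentation (pure Python, invented data).
def dist2(a, b):
    return sum((x - y) ** 2 for x, y in zip(a, b))

def mean(pts):
    return tuple(sum(xs) / len(pts) for xs in zip(*pts))

def kmeans(points, k, iters=20):
    centroids = points[:k]  # naive initialization: first k points
    for _ in range(iters):
        clusters = [[] for _ in range(k)]
        for p in points:
            # Assign each point to its nearest centroid.
            i = min(range(k), key=lambda i: dist2(p, centroids[i]))
            clusters[i].append(p)
        centroids = [mean(c) if c else centroids[i] for i, c in enumerate(clusters)]
    return centroids, clusters

customers = [(100, 2), (120, 3), (110, 2),     # low-spend customers
             (900, 30), (950, 28), (880, 32)]  # high-spend customers
centroids, clusters = kmeans(customers, k=2)
print(sorted(len(c) for c in clusters))  # two segments of 3 customers each
```

Each resulting segment can then receive its own marketing actions, which is exactly the kind of analysis the group's tools generate code for on different data mining frameworks.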

Accessibility and Usability of Web interfaces

Web accessibility aims to ensure that applications and websites are usable by the largest possible number of people, regardless of their knowledge or abilities, and independently of the technical characteristics of the equipment used to access the Web. To ensure accessibility, different standards and guidelines have been developed that explain how web pages must be built to be accessible. An accessible website provides multiple benefits to both its owners and its users.

On the other hand, web usability aims to ensure that an application or website can be used by the greatest number of people in the easiest possible way. Web usability depends primarily on the ease of learning a system, its ease of use, and its flexibility and robustness.
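Some of the accessibility guidelines mentioned above are machine-checkable. As a small illustration, one well-known rule requires a text alternative for images; the sketch below flags `img` tags without an `alt` attribute using only the standard library (a real audit would use a full checker covering the complete guidelines).

```python
# Minimal accessibility check: count <img> elements missing an alt attribute.
from html.parser import HTMLParser

class ImgAltChecker(HTMLParser):
    def __init__(self):
        super().__init__()
        self.missing_alt = 0

    def handle_starttag(self, tag, attrs):
        if tag == "img" and "alt" not in dict(attrs):
            self.missing_alt += 1

page = '<p><img src="logo.png" alt="Company logo"><img src="deco.png"></p>'
checker = ImgAltChecker()
checker.feed(page)
print(checker.missing_alt)  # one image lacks a text alternative
```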

Web Personalization and Automatic Application Generation

One of the principles that sustains Internet initiatives is that competitors are closer than ever, just one click away. Hence, the ability to build user loyalty is critical to ensure business success, and marketing techniques are important to create lasting relationships with customers. The key issue is listening to customers and improving their user experience by presenting them with a personalized offer. The factors to customize are the content, structure, and presentation of the website.

In order to support the complex processes of current Web application development, alternatives have been proposed for the construction and improvement of these applications through the use of methodologies that systematize the development process and ensure its quality by applying Software Engineering techniques. Facing this problem, we propose a development methodology together with high-level abstract models that cover the development needs of these applications and provide automatic code generation. The main problem is that small and medium-sized companies that develop Web applications normally do not follow any development methodology, and when one is used it is designed to improve manual coding. Manual coding of Web applications is a time- and resource-consuming task. Automatic code generation simplifies the most expensive phases of the development process of these applications (coding, review, and maintenance), reducing the technical and human resources used and, in some cases, improving the quality of the final product.
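The two ideas of this line, personalization and automatic generation, can be sketched together: a declarative page model (invented for the example) is turned into HTML by a generator, and an optional user profile filters which sections are produced. The group's actual methodology works on richer abstract models; this only shows the principle.

```python
# Hypothetical page model and generator: model + profile -> HTML.
page_model = {
    "title": "Product catalogue",
    "sections": ["New arrivals", "Best sellers"],
}

def generate_html(model, profile=None):
    # Personalization: keep only the sections the user's profile is interested in.
    sections = [s for s in model["sections"]
                if profile is None or s in profile["interests"]]
    parts = [f"<h1>{model['title']}</h1>"]
    parts += [f"<section><h2>{s}</h2></section>" for s in sections]
    return "\n".join(parts)

print(generate_html(page_model))                              # full page
print(generate_html(page_model, {"interests": {"Best sellers"}}))  # personalized
```

Because the page is described declaratively, changing the model regenerates the code automatically, which is what removes the cost of manual coding, review, and maintenance.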

Spatial Data Warehouses

Usually, the analysis and exploration of multidimensional structures is related to spatiality. However, spatiality has only recently been introduced into the different data warehouse design methodologies. Our solution is to introduce spatial data at the conceptual level and to adapt them to different users and analysis requirements. The data warehouse, both the repository and the analysis tools (OLAP, data mining, etc.), is then generated from these conceptual models using semi-automatic transformations. We currently have a prototype, developed on the Eclipse framework, that implements our model-driven methodology. It basically consists of different editors and a transformation engine that produce the code and metadata necessary for the implementation of a data warehouse. Both the modeling and the different sets of transformations have been developed using standard languages such as UML, CWM, and QVT. In this way we ensure the intuitiveness of the process, interoperability between models, and support for and adaptation to different platforms.
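The flavor of these semi-automatic transformations can be shown with a toy model-to-code step: a conceptual multidimensional model annotated with spatial attributes (all names invented) is transformed into relational DDL. The real prototype uses MDA standards such as UML and QVT rather than plain Python.

```python
# Hypothetical conceptual model: table -> list of (attribute, kind) pairs.
# "SPATIAL:POINT" marks a spatial attribute at the conceptual level.
conceptual_model = {
    "dim_store": [("store_id", "INTEGER"), ("location", "SPATIAL:POINT")],
    "fact_sales": [("store_id", "INTEGER"), ("amount", "DECIMAL")],
}

def to_ddl(model):
    stmts = []
    for table, attrs in model.items():
        cols = []
        for name, kind in attrs:
            if kind.startswith("SPATIAL:"):
                # Spatial annotations become geometry columns in the target platform.
                cols.append(f"{name} GEOMETRY({kind.split(':')[1]})")
            else:
                cols.append(f"{name} {kind}")
        stmts.append(f"CREATE TABLE {table} ({', '.join(cols)});")
    return stmts

for s in to_ddl(conceptual_model):
    print(s)
```

Swapping the transformation rules would target a different platform from the same conceptual model, which is the interoperability argument made above.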

Data Visualization and Exploitation Tool Design

In recent years, the increase in the amount of stored information has made it increasingly necessary to have data visualization techniques that identify patterns, trends, and anomalies in an intuitive and fast way. While most of the available tools provide dashboards and different kinds of charts as a solution to this problem, there is no methodology or approach that allows customers to communicate their information visualization needs. Thus, the stored data are used and exploited in a suboptimal manner, resulting in less agile decision support. In this research line, dashboards and other visualization tools are modeled from the point of view of user requirements, ensuring that the BI tools used meet the operational and data exploitation needs.
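A toy version of requirement-driven visualization, with an invented vocabulary of analysis goals and data kinds: the user's information need is mapped to a suitable chart type instead of leaving the choice ad hoc. The actual research models requirements far more richly; this only illustrates the direction.

```python
# Hypothetical rules: (analysis goal, data kind) -> recommended chart type.
RULES = {
    ("comparison", "categorical"): "bar chart",
    ("trend", "temporal"): "line chart",
    ("distribution", "numerical"): "histogram",
    ("relationship", "numerical"): "scatter plot",
}

def recommend_chart(goal, data_kind):
    # Fall back to a plain table when no rule matches the requirement.
    return RULES.get((goal, data_kind), "table")

print(recommend_chart("trend", "temporal"))
print(recommend_chart("comparison", "spatial"))
```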

Most relevant publications