Informatica Economica Vol. XII No. 1/2008

A Development Process for Enterprise Information Systems Based on Automatic Generation of the Components (2007 Grigore C. Moisil prize nomination)
This paper presents ideas concerning the development of Enterprise Information Systems (EIS). It combines known elements from the software engineering domain with original elements that the author has conceived and tested. The author has pursued two major objectives: to use a simple description for the concepts of an EIS, and to achieve a rapid and reliable EIS development process at minimal cost. The first goal was achieved by defining models that describe the conceptual elements of the EIS domain: entities, events, actions, states and attribute domains. The second goal relies on a predefined architectural model for the EIS, on predefined analysis and design models for the elements of the domain and, finally, on the automatic generation of the system components. The proposed methods do not depend on a particular programming language or database management system; they are general and may be applied to any combination of such technologies.

Neural Network Based Model Refinement
In this paper, model bases and model generators are presented in the context of model refinement. The article proposes a neural-network-based model refinement technique for software metrics estimation. Neural networks are introduced as instruments of model refinement, and a refinement technique is proposed for ranking and selecting input variables. A case study shows the practice of model refinement using neural networks.

Knowledge Society and the Flat World of Thomas L. Friedman
The traditional society has lately become the knowledge society. As the name states, knowledge is the most important asset of our times. There are different opinions on the knowledge society and on globalization, but this paper deals with Thomas Friedman's flat world. The world has grown smaller with the developments in information and communication technology. Globalization has three periods and ten flatteners, described at length by Friedman in his book. Technology changes the way we communicate, collaborate and share our knowledge.

Explicit Description of the Input Data for the Program CRAC 2.0 Used in the Applications of the Credibility Theory
This paper gives a brief overview of the structure and capabilities of the program CRAC 2.0. It shows how sectors can be determined in order to use the hierarchical model that is built into the software. Furthermore, a general structure for defining insurance problems to be solved by CRAC 2.0 is discussed.

Autopoiesis in Virtual Organizations
Virtual organizations continuously gain popularity because of the benefits they create. Generally, they are defined as temporary adhocracies: project-oriented, knowledge-based network organizations. The goal of this paper is to present the hypothesis that the knowledge system developed by a virtual organization is an autopoietic system. The term "autopoiesis" was introduced by Maturana for self-productive systems. In this paper, Wikipedia is described as an example of an autopoietic system. The first part of the paper discusses virtual organizations. Next, interpretations of autopoiesis are presented, along with the value of autopoiesis for the governance of virtual organizations. The last part of the work comprises a short presentation of Wikipedia and its principles, and conclusions on Wikipedia as an autopoietic system.
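The EIS abstract above relies on declarative concept descriptions (entities, attribute domains) that drive automatic generation of system components. The sketch below is a minimal, assumed illustration of that idea; the entity name, the attribute domains and the generated SQL dialect are inventions of this sketch, not the author's actual models.

```python
# Hypothetical sketch: a declarative entity description drives the automatic
# generation of a component (here, a CREATE TABLE statement). The "Invoice"
# entity and the domain-to-SQL mapping are illustrative assumptions.

DOMAIN_TO_SQL = {
    "identifier": "INTEGER PRIMARY KEY",
    "text": "VARCHAR(255)",
    "money": "DECIMAL(12,2)",
    "date": "DATE",
}

# A declarative description of one entity: name plus (attribute, domain) pairs.
invoice_entity = {
    "name": "Invoice",
    "attributes": [
        ("invoice_id", "identifier"),
        ("customer_name", "text"),
        ("amount", "money"),
        ("issue_date", "date"),
    ],
}

def generate_table_ddl(entity: dict) -> str:
    """Generate a CREATE TABLE statement from an entity description."""
    columns = ",\n  ".join(
        f"{attr} {DOMAIN_TO_SQL[domain]}" for attr, domain in entity["attributes"]
    )
    return f"CREATE TABLE {entity['name']} (\n  {columns}\n);"

if __name__ == "__main__":
    print(generate_table_ddl(invoice_entity))
```

The same description could, in principle, feed other generators (data-access classes, input forms), which is what makes such an approach independent of a particular programming language or DBMS.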
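The model-refinement abstract proposes ranking and selecting input variables for a neural estimator of software metrics. The paper's exact technique is not reproduced here, so the sketch below shows one common way to do this (a small regression network plus permutation importance) on synthetic data; the metric names and data are assumptions.

```python
# Illustrative sketch (not the paper's method): rank candidate inputs of a
# neural software-metric estimator by permutation importance, then keep the
# most influential ones for a refined model. Data below is synthetic.
import numpy as np
from sklearn.neural_network import MLPRegressor
from sklearn.inspection import permutation_importance

rng = np.random.default_rng(0)
n = 300
# Hypothetical candidate inputs: size, complexity, churn, plus a noise variable.
X = rng.normal(size=(n, 4))
effort = 3.0 * X[:, 0] + 1.5 * X[:, 1] + 0.5 * X[:, 2] + rng.normal(scale=0.2, size=n)

model = MLPRegressor(hidden_layer_sizes=(16,), max_iter=3000, random_state=0)
model.fit(X, effort)

# Rank the inputs: variables whose shuffling hurts the fit most matter most.
result = permutation_importance(model, X, effort, n_repeats=20, random_state=0)
ranking = np.argsort(result.importances_mean)[::-1]
names = ["size", "complexity", "churn", "noise"]
for i in ranking:
    print(f"{names[i]:11s} importance = {result.importances_mean[i]:.3f}")

# A refined model would then be retrained on the selected top-ranked inputs.
```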
Models of Non-Life Insurance Mathematics
In this communication we discuss two regression credibility models from non-life insurance mathematics that can be solved by means of matrix theory. In the first regression credibility model, starting from a well-known representation formula for the inverse of a special class of matrices, a risk premium is calculated for a contract with risk parameter q. In the second regression credibility model, we obtain a credibility solution in the form of a linear combination of the individual estimate (based on the data of a particular state) and the collective estimate (based on aggregate USA data). Mathematics Subject Classification: 62P05.

Characteristics for Software Optimization Projects
The increasing complexity of software systems requires the identification and implementation of methods and techniques to manage it. The software optimization project is one way in which software complexity is controlled. Such a project must also meet the organization's need to earn a profit. The software optimization project is an integral part of the application life cycle because it shares the same resources, depends on other stages and influences subsequent phases. The optimization project has some particularities because it works on a finished product, focusing on its quality. The process is quality- and performance-oriented and assumes that the product life cycle is almost complete.

Identifying Data Affected by Aberrant Errors. Applied Program
The statistical survey has become a very powerful tool for understanding, interpreting and predicting reality. Nevertheless, even within the accepted margin of error, a survey's results may be inconclusive. This is mostly due to sample data quality. In this article, we describe two tests for identifying and eliminating aberrant errors, and at the end we present a program that applies these tests.

Distribution of Object-Oriented Databases. A Viewpoint of the MVDB Model's Methodology and Architecture
In databases, much work has been done towards extending models with advanced tools such as view technology, schema evolution support, multiple classification, role modeling and viewpoints. Over the past years, most of the research dealing with multiple object representation and evolution has proposed enriching the monolithic vision of the classical object approach, in which an object belongs to a single class hierarchy. In particular, integrating the viewpoint mechanism into the conventional object-oriented data model gives it flexibility and improves the modeling power of objects. The viewpoint paradigm refers to the multiple descriptions, the distribution and the evolution of an object. It can also be a valuable contribution to the distributed design of complex databases. The motivation of this paper is to define an object data model integrating viewpoints in databases and to present a federated database architecture that integrates multiple viewpoint sources following a local-as-extended-view data integration approach.

EvalEdit – Online Editor for E-learning Tests
As our society has gradually changed in the past few years with new technology, the internet has become ever more present at our workplace and in our learning methods. The internet has brought easier access to information, offering a range of tools and capabilities to workers.
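The non-life insurance abstract above describes its result only verbally, as a linear combination of the individual and the collective estimate. The abstract does not reproduce the formula; the standard regression-credibility (Hachemeister/Bühlmann-type) form it refers to can be written, with notation assumed here, as:

```latex
% Generic form of the credibility solution: a weighted combination of the
% individual estimate and the collective estimate.
\[
  \widehat{\beta}_i \;=\; Z_i\, b_i \;+\; (I - Z_i)\, b ,
\]
% b_i: individual (state-level) regression estimate, b: collective estimate,
% Z_i: credibility weight matrix. In the scalar case this reduces to
\[
  \widehat{\mu}_i \;=\; z_i\, \bar{X}_i \;+\; (1 - z_i)\, \bar{X},
  \qquad 0 \le z_i \le 1 .
\]
```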
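The aberrant-errors abstract mentions two tests without naming them here. As an illustration of the kind of screening such a program performs (not necessarily the paper's tests), the sketch below flags suspect observations with a modified z-score based on the median and the median absolute deviation; the data and threshold are assumptions.

```python
# Illustrative outlier screening: flag values whose modified z-score
# (median/MAD based) exceeds a threshold, then report the cleaned sample.
import numpy as np

def flag_aberrant(values, k: float = 3.5):
    """Return a boolean mask marking values with |modified z-score| > k."""
    values = np.asarray(values, dtype=float)
    median = np.median(values)
    mad = np.median(np.abs(values - median))
    modified_z = 0.6745 * (values - median) / mad
    return np.abs(modified_z) > k

sample = np.array([10.2, 9.8, 10.1, 10.4, 9.9, 10.0, 42.0, 10.3])  # 42.0 is suspect
mask = flag_aberrant(sample)
print("aberrant values:", sample[mask])
print("cleaned sample:", sample[~mask])
```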
A Connectionist Intelligent System for Accounting
Neural networks are a computing paradigm developed from the fields of artificial intelligence and brain modelling which has lately become very popular in business. Many researchers see neural network systems as solutions to business problems such as modelling and forecasting, and accounting and auditing have also been touched by the new technology.

Interferences Between e-Commerce and Insurance
Internet use has grown faster than any other technology in recent years and has a powerful impact on the development of international commerce. New opportunities appear for small and medium companies that use the internet to trade across borders. Citizens save time and money by making payments on the internet and ordering goods and services from their home computers. Alongside these advantages, a wide variety of barriers arise and disturb internet activity. Using the internet, companies can be affected by loss of revenue, loss of information, compromised security data, reputation damage, interruption of activity, theft of private information, etc. To cover these internet risks, insurers develop new products in order to meet companies' and citizens' expectations.

Stable Structures for Distributed Applications
For distributed applications, we define the linear, tree and graph structure types, with different variants and ways of aggregating them. Distributed applications have assigned structures whose characteristics influence the costs of the development cycle stages and the operating costs transferred to each user. We also present the quality characteristics of a structure for a stable application, focusing on the stability characteristic, and define indicators for estimating its level. The factors that influence stability and the ways of increasing it are thus identified, while the development, usage and maintenance costs are kept within limits that ensure the overall efficiency of the application. The basic aspects of distributed applications are presented: definition, peculiarities and importance. The aspects of the development cycle of a distributed application are detailed. We also give the mechanisms for building the defined structures and analyze their complexity for a distributed application of a virtual store.

Web-enabled Data Warehouse and Data Webhouse
In this paper, our objectives are to understand what a data warehouse means, examine the reasons for building one, appreciate the implications of the convergence of Web technologies with those of the data warehouse, and examine the steps for building a Web-enabled data warehouse. The web revolution has propelled the data warehouse onto the main stage, because in many situations the data warehouse must be the engine that controls or analyses the web experience. In order to step up to this new responsibility, the data warehouse must adjust; its nature needs to be somewhat different. As a result, our data warehouses are becoming data webhouses. The data warehouse is becoming the infrastructure that supports customer relationship management (CRM), and it is being asked to make the customer clickstream available for analysis. This rebirth of the data warehousing architecture is called the data webhouse.
Strategy for Selecting a Business Intelligence Solution
Considering the demands imposed by the knowledge society, each organization strives to become an intelligent organization and, by means of a new and innovative Business Intelligence (BI) strategy, to gain a competitive market advantage. A need has therefore become apparent within organizations for proactive, extensible, performance-oriented instruments with a stronger impact than conventional reports, scorecards and OLAP systems, the year 2007 marking the start of a new BI era. This paper analyzes BI trends and the strategy for selecting a new BI solution within organizations.

Research on the Elaboration of an Integrated System Based on XML Data Analysis
This paper addresses the importance of XML for better organizing and managing text-based data. The document provides the specification of a data model for describing information organization structures (metadata) for collections of networked information. As an important result, we propose a new model of an integrated system based on XML and data analysis. The paper also provides the steps to follow for this data model using XML, the Extensible Markup Language.

Building a Dynamic ASP.NET 2.0 GridView Control
Microsoft Visual Studio 2005 (based on ASP.NET 2.0), the successor to Visual Studio .NET 2003, has many new features designed for Web developers. This article shows how an ASP.NET 2.0 control can be dynamically connected to a Microsoft Access database. The delete and update operations are implemented using a GridView control and SQL queries. The connection between the database and the .NET application is made with the OleDb data provider and the new Access Data Source control, and the SQL queries are implemented with OleDbCommand.

A Study Looking at the Electronic Payment Market
The aim of this paper is to analyze the electronic payment market. We identified the most important characteristics of electronic payment systems, especially those mentioned by the European Central Bank. For this we used the companies' websites, the Weka software and the k-means algorithm for data clustering.

From Error Treatment to Exception Treatment in the Execution Control of Visual Basic Programs
In order to comply with quality standards and best practices, the execution of professional programs must be rigorously controlled so as to avoid unpredictable situations that might generate anomalies and could lead to system blockage, forced termination of execution and data loss.

Analysis of the Romanian Offer of ERP Solutions
This article analyzes the current Romanian offering of ERP (Enterprise Resource Planning) solutions. The first part of the article examines the level of ERP adoption by Romanian companies, drawing mainly on two studies about the ERP market in Romania published in 2006: "ERP Romania 2006" by Pierre Audoin Consultants (PAC) and a second study made by the media company Agora Media and the market research company Sensimark. The second part of the article makes a comparative analysis of the seven best-placed ERP products on the Romanian market. To support this analysis, a summary table has been included with all seven products together with the analysis criteria, grouped in three categories: functionality, technical characteristics, and the market segment addressed.
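The XML abstract above describes a metadata model for information organization structures. As a rough, assumed illustration of such a description and of feeding it into an analysis step, the sketch below parses a small hand-made record with Python's standard library; the element names are invented for the example.

```python
# Hypothetical metadata record for a networked information collection and a
# minimal parsing step; the element names are illustrative assumptions.
import xml.etree.ElementTree as ET

METADATA = """
<collection name="economic-reports">
  <resource id="r1">
    <title>Quarterly inflation report</title>
    <format>text/html</format>
    <subject>inflation</subject>
  </resource>
  <resource id="r2">
    <title>ERP market survey</title>
    <format>application/pdf</format>
    <subject>ERP</subject>
  </resource>
</collection>
"""

root = ET.fromstring(METADATA)
# Turn the XML description into plain records that a data-analysis step can use.
records = [
    {
        "id": res.get("id"),
        "title": res.findtext("title"),
        "format": res.findtext("format"),
        "subject": res.findtext("subject"),
    }
    for res in root.findall("resource")
]
for rec in records:
    print(rec)
```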
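The electronic-payment study uses Weka's k-means implementation to cluster payment systems by their characteristics. A rough equivalent in Python with scikit-learn is sketched below; the feature vectors and the choice of three clusters are assumptions of this sketch, not results from the paper.

```python
# Illustrative k-means clustering of electronic payment systems; the feature
# values (fee level, settlement speed, merchant coverage) are made up.
import numpy as np
from sklearn.cluster import KMeans

features = np.array([
    # [fee level, settlement speed, merchant coverage]
    [0.9, 0.2, 0.8],
    [0.8, 0.3, 0.9],
    [0.2, 0.9, 0.4],
    [0.1, 0.8, 0.3],
    [0.5, 0.5, 0.6],
    [0.4, 0.6, 0.5],
])

kmeans = KMeans(n_clusters=3, n_init=10, random_state=0).fit(features)
print("cluster labels:", kmeans.labels_)
print("cluster centers:\n", kmeans.cluster_centers_)
```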
The Replication Mechanism in a Romanian ERP System Environment
Modern relational database management systems provide replication technology. Large enterprises are often spread across the country, and although WANs can be very fast and reliable these days, it is often better for each location to have a local copy of the data rather than a single database at a central location. This usually means that replication is required so that each location has the most up-to-date data. This paper presents a replication mechanism implemented in a Romanian ERP system environment.

Designing Algorithms Using CAD Technologies
A representative example of a modular e-learning platform application, 'Logical diagrams', is intended to be a useful learning and testing tool for the beginner programmer as well as for the more experienced one. The problem this application tries to solve concerns young programmers who forget the fundamentals of the domain: algorithmics. Logical diagrams are a graphic representation of an algorithm that uses different geometrical figures (parallelograms, rectangles, rhombuses, circles) with particular meanings, called blocks, connected to one another to reveal the flow of the algorithm. The role of this application is to help the user build the diagram for the algorithm, then automatically generate the C code and test it.

A Linear Algorithm for the Black-Scholes Economic Model
The pricing of options is a very important problem encountered in the financial domain. The famous Black-Scholes model provides an explicit closed-form solution for the values of certain (European-style) call and put options. For many other options, however, either no closed-form solution exists or, if it does, the formulas exhibiting it are complicated and difficult to evaluate accurately by conventional methods. The aim of this paper is to study the possibility of obtaining the numerical solution of the Black-Scholes equation in parallel, by means of several processors, using the finite difference method. A comparison between the complexity of the parallel algorithm and that of the serial one is given.

Medical Virtual Public Services
Healthcare enterprises are highly disconnected. This paper proposes a solution that will provide citizens, businesses and medical enterprises with improved access to medical virtual public services. The referred medical services are based on existing national medical Web services and support medically required services provided by physicians and supplementary health care practitioners, laboratory services and diagnostic procedures, and clinic and hospital services. Requirements and specific rules of these medical services are considered, and personalization of user preferences is supported. The architecture is based on adaptable process management technologies, allowing for virtual services that are dynamically combined from existing national medical services. In this way, a comprehensive workflow process is set up, allowing for service-level agreements, an audit trail and an explanation of the process to the end user. The process engine operates on top of a virtual repository, providing a high-level semantic view of information retrieved from heterogeneous information sources, such as national sources of medical services. The system relies on a security framework to ensure that all high-level security requirements are met.
The system's architecture is business oriented: it focuses on Service Oriented Architecture (SOA) concepts, asynchronously combining Web services, Business Process Management (BPM) rules and BPEL standards.
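Returning to the abstract "A Linear Algorithm for the Black-Scholes Economic Model" above: the paper studies a parallel finite-difference solution, and even a serial sketch shows the discretization involved. The explicit scheme below for a European call is an assumed illustration; the parameters, grid sizes and boundary treatment are choices of this sketch, not the paper's.

```python
# Sketch of an explicit finite-difference solution of the Black-Scholes PDE
# for a European call (serial version; the paper parallelizes this kind of
# grid sweep). Parameters and grid sizes are illustrative assumptions.
import numpy as np

K, r, sigma, T = 100.0, 0.05, 0.3, 1.0   # strike, rate, volatility, maturity
S_max, N, M = 300.0, 150, 4000           # price grid bound, space and time steps

dS, dt = S_max / N, T / M
i = np.arange(1, N)                       # interior grid indices, S_i = i * dS

# Explicit-scheme coefficients for V_i at the earlier time level.
a = 0.5 * dt * (sigma**2 * i**2 - r * i)
b = 1.0 - dt * (sigma**2 * i**2 + r)
c = 0.5 * dt * (sigma**2 * i**2 + r * i)

S = np.linspace(0.0, S_max, N + 1)
V = np.maximum(S - K, 0.0)                # payoff at maturity

for m in range(M):                        # march backward from T to 0
    tau = (m + 1) * dt                    # time remaining after this step
    V_new = V.copy()
    V_new[1:N] = a * V[0:N-1] + b * V[1:N] + c * V[2:N+1]
    V_new[0] = 0.0                        # call is worthless at S = 0
    V_new[N] = S_max - K * np.exp(-r * tau)  # deep in-the-money boundary
    V = V_new

print("call value at S = 100:", V[np.searchsorted(S, 100.0)])
```

Each time level depends only on the previous one, which is what makes splitting the price grid among several processors straightforward, as the abstract indicates.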