Complete migration to open source software in the Valencian Regional Ministry of Infrastructure and Transport
The project

EUROPEAN UNION. EUROPEAN REGIONAL DEVELOPMENT FUND. "A way to build Europe".
Index

Part 1. Corporate and web developments
Chapter 1. Overview
Chapter 2. gvdades: experiences with database management systems
Chapter 3. gvmétrica and MOSKitt: definition of a development and support methodology
Chapter 4. gvhidra: development of a framework for PHP
Chapter 5. Implementation of version control systems: CVS and Subversion
Chapter 6. Implementation of a report generation tool
Chapter 7. Migration of the web portal and intranet
Chapter 8. Workflow for business process management
Chapter 9. gvadoc: document management system

Coordinated and published by
Legal Deposit M
Translated by Crown Communication, S.L. C/ Archiduque Carlos, nº 65, Valencia
Design and layout: Edit Lin Editorial, S.L. Avda. de Portugal, 85, local. Madrid

Licence: This publication is distributed under the Creative Commons Attribution - Share Alike 2.5 Spain licence. To see a copy of this licence, go to licenses/by-sa/2.5/es/
Part 2. Operating systems and communication
Chapter 10. End user PC environment
Chapter 11. Local network server environment
Chapter 12. Networking and communication environment
Chapter 13. Corporate server environment

Part 3. GIS and CAD: gvsig
Chapter 14. gvsig: introduction
Chapter 15. gvsig: description and justification of our initial situation
Chapter 16. gvsig: how it has evolved into the current solution
Chapter 17. gvsig: conclusions
Chapter 18. gvsig: future lines of work

References
Glossary
INTRODUCTION

gvpontis, a success story

gvpontis is the name we have given to the project to migrate all the Valencian Regional Ministry of Infrastructure and Transport's (CIT) information systems from commercial to open source programmes. The idea behind the project goes back to 2003, a time of major upheaval in programme licensing costs. This change in commercial software sales strategy, together with the CIT's policy of having legal licences for all our users, meant a considerable increase in licence costs, which became unsustainable as the majority of our budget went on acquiring these licences.

In addition, many attempts had been made to obtain accurate, centralised information through corporate information systems, but this had to take into account the fact that our staff used a huge range of computer tools in their daily tasks, and these all needed to be brought under the same umbrella.

To give some idea of the scale of the project, around a thousand members of staff use the different information systems. Six hundred of these staff work in general administration, whilst the other four hundred work in more specialist positions, mainly as engineers and architects. These staff are divided between the CIT's headquarters, three regional offices and some area offices.

In the light of this situation, we made a proposal to the Valencian Directorate General for Modernisation to migrate all our tools and systems to open source software. The proposal was accepted and the migration plan was authorised in September. The last quarter of 2003 was spent studying the technical feasibility of the project. The University of Valencia and the Universidad Politécnica de Valencia were entrusted with several studies of the open source software alternatives on the market at the time. Several SMEs which used these new technologies were also given the task of developing small-scale computer applications using the new concepts.
All the reports and applications indicated that there were no dangers or reasons not to start the project. All these reports are available on the project's web site. The only real risk was the technological change which needed to be implemented. This was our only fear, but it will be better illustrated in the report which follows.

Finally, in January 2004, gvpontis was begun. The project was to be carried out over a period of four years and was to include the migration of all the CIT's information systems. It was to be carried out in an orderly, gradual manner based on two principles which were to govern the whole process:

- The project was not to be allocated any additional resources. Only resources already allocated to the Organisation and Computing Department were to be used.
- The negative effects of the project on the daily work of CIT staff were to be kept to a minimum. The work itself is more important than the means.

Once the project was started, it was found that even the most rudimentary tools evolved to a more than acceptable standard. The progression has been almost geometric, and interoperable systems, open standards and open source tools and software are already on an even footing in our information systems. As of 2008, commercial software is used more than open source programmes only for multimedia and other similar tasks.

As explained below, we have encountered problems; we have managed to solve most of them, and others are still pending. We have spent time on training, which is essential and needs to be a constant in the computing business, and have trained the users and, above all, the departmental staff.

We must admit with hindsight that the major problem over the years has been the fear of change. This apprehension over unbuffered technological innovation has proved to be this project's worst enemy development-wise. Tackling fear is also a challenge, yet we have learned from our own experience that a challenge can always be taken on board. In our case, we have faced up to the challenge with well-laid plans, training and an alternative plan of action just in case.

We will have finished the migration process by December. There will still be a few remaining areas and some obsolete applications which have not been included in the migration process, but only because this is not an economically attractive proposition. Time will do away with these leftover applications, which do not affect our day-to-day work because we have managed to include them in the new working environment.

We have been told that this is what is known as a success story. Yet we would prefer to call it a unique professional experience.

In conclusion, there are a lot of people I am extremely grateful to. First and foremost, I would like to thank all the staff in the Organisation and Computing Department. They have worked extremely hard over the past four years. This also includes our technical experts and our clerical staff. They have all helped to overcome the problems that have arisen. There is no room to list them all here, but my sincere thanks go out to all the Department's own staff and to our external partners. To the successive Administrative General Secretaries who have worked at the CIT over the years and have supported, and on some occasions protected, the project. To Isabel Villalonga, whose characteristic energy gave the project a considerable boost at the start. To José Manuel Palau, who also saw how important the project was. And to Ana Climent, who has suffered the project the longest but has always championed it. During these four years, the two Regional Ministers have also influenced the project: José Ramón García Antón, who approved the project at the start and who had to put up with it on several occasions, and Mario Flores, who joined the project when its teething problems were over but who has been involved in the final, definitive part of the project. Only one person has been at the helm of the project during these four years. Gaspar Peral, the CIT's undersecretary, has been the continuous driving force behind gvpontis. Without his support and help, this report would not be telling a success story; it would be recounting a failed or unfinished project.

Martín García Hernández
Head of the Organisation and Computing Department
Part 1
Corporate and web developments

gvpontis includes a wide variety of application development projects and tasks, a brief summary of which will be given below as an overview of the main lines of work undertaken. We shall then move on to give a more detailed explanation of each one in the subsequent sections.
CHAPTER 1
Overview

In the framework of the gvpontis project, and based on the proposals put forward after the preliminary studies in corporate and web developments, plans were made to implement a project to migrate to open source software, so that the development of new applications would be based on this type of software.

The migration of applications was influenced by the simultaneous decision to move from the client/server development architecture in use when the project began to a three-layer, and later multi-layer, architecture. Initially this change of paradigm introduced another element of uncertainty into the project. Over time, however, it has proved beneficial as a result of the abstraction produced when programming each of the layers the information system is divided into: presentation, business and data. It has made our developers more specialised and has defined the functions of our information systems more clearly.

The two working tools used to develop applications in the client/server environment at the start of the migration project were Oracle Developer and PowerBuilder. In order to be able to use the three-layer architecture, we incorporated PHP and Java for the new application developments. It was decided that the applications which were already being developed in the client/server environment would be implemented in three layers and that Oracle Developer, the language they were initially programmed in, would be maintained. These applications would gradually be converted to PHP or Java according to the features of each one.

Database management systems (DBMS) are one of the cornerstones of the gvpontis project. Thus, the gvdades project was developed to integrate the research and implementation efforts in this field. The Valencian Regional Ministry of Infrastructure and Transport's (CIT) departmental applications were developed in Access, whilst its corporate applications used Oracle.
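The presentation/business/data split described above can be illustrated with a minimal sketch. Python and sqlite3 are used here purely for illustration (the CIT's actual layers were written in PHP and Java against Oracle and PostgreSQL), and all class and table names are hypothetical:

```python
import sqlite3

# Data layer: the only code that knows which DBMS sits behind the
# application, so swapping one engine for another touches one class.
class ProjectRepository:
    def __init__(self, conn):
        self.conn = conn
        self.conn.execute(
            "CREATE TABLE IF NOT EXISTS projects (id INTEGER PRIMARY KEY, budget REAL)")

    def add(self, project_id, budget):
        self.conn.execute("INSERT INTO projects VALUES (?, ?)", (project_id, budget))

    def budgets(self):
        return [row[0] for row in self.conn.execute("SELECT budget FROM projects")]

# Business layer: rules only; no SQL and no formatting.
class ProjectService:
    def __init__(self, repo):
        self.repo = repo

    def total_budget(self):
        return sum(self.repo.budgets())

# Presentation layer: formatting only; no rules and no SQL.
def render_total(service):
    return f"Total budget: {service.total_budget():.2f} EUR"

repo = ProjectRepository(sqlite3.connect(":memory:"))
repo.add(1, 1000.0)
repo.add(2, 250.5)
print(render_total(ProjectService(repo)))  # → Total budget: 1250.50 EUR
```

The abstraction the chapter mentions comes from exactly this arrangement: each layer can evolve, or be reassigned to a specialised developer, without the others noticing.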
As there were no sufficiently well-developed alternatives to migrate the Access applications to, a decision was taken to develop new applications in PHP with PostgreSQL and MySQL. It was decided that Oracle should be maintained in the corporate developments already under way, that new applications should be developed independently of the database, and that they should at least be able to run on PostgreSQL. MySQL was also to be used to support the CIT's web portals because of its user-friendliness.

The existence of numerous other developments, as well as the need to deal with other Valencian Regional Government ministries which used Oracle, meant that coexistence with Oracle on Linux servers was essential. Thus, the decision was taken to migrate from Oracle 8i to Oracle 10gR2.

Another important area which needed to be sorted out was how applications using two different DBMSs, one of which was Oracle, could obtain data. We found ourselves having to develop a tool that carried out replication at table level. This solution is currently being replaced by the use of WebServices.
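The table-level replication tool mentioned above is described in more detail in Chapter 2. As a rough sketch of the idea (sqlite3 standing in for both Oracle and PostgreSQL, and every table and column name hypothetical), triggers log changes on the master table so that a periodic "fast refresh" only has to apply the logged rows rather than re-dump the whole table:

```python
import sqlite3

# Stand-ins for the two DBMSs; the real tool replicated Oracle tables
# into PostgreSQL.
master = sqlite3.connect(":memory:")
slave = sqlite3.connect(":memory:")

master.executescript("""
CREATE TABLE projects (id INTEGER PRIMARY KEY, name TEXT);
-- change log filled by triggers, so a fast refresh only touches
-- rows modified since the last run
CREATE TABLE projects_log (id INTEGER, op TEXT);
CREATE TRIGGER projects_ins AFTER INSERT ON projects
  BEGIN INSERT INTO projects_log VALUES (NEW.id, 'I'); END;
CREATE TRIGGER projects_del AFTER DELETE ON projects
  BEGIN INSERT INTO projects_log VALUES (OLD.id, 'D'); END;
""")
slave.execute("CREATE TABLE projects (id INTEGER PRIMARY KEY, name TEXT)")

def fast_refresh():
    """Apply logged changes to the replica, then clear the log."""
    for row_id, op in master.execute("SELECT id, op FROM projects_log"):
        if op == 'D':
            slave.execute("DELETE FROM projects WHERE id = ?", (row_id,))
        else:
            row = master.execute(
                "SELECT name FROM projects WHERE id = ?", (row_id,)).fetchone()
            if row is not None:
                slave.execute(
                    "INSERT OR REPLACE INTO projects VALUES (?, ?)", (row_id, row[0]))
    master.execute("DELETE FROM projects_log")
    slave.commit()
    master.commit()
```

A sketch only: updates, conflict handling and scheduling are omitted, which is part of why a WebServices data-access layer is the preferred replacement.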
We would also like to mention the problems we have still not found an optimum solution for. Our main lines of work centre on these problems, which include the search for better PostgreSQL management and monitoring tools, and improvements to security management and access control.

The previous paragraphs have described the technological changes implemented in the migration project. However, cultural change was also necessary to streamline the interaction of the different profiles involved in the various development phases, both because of the migration of a large part of the application inventory (which required reengineering, as the applications had been developed a good few years before) and to ensure that any new developments the CIT might need would stand the test of time. This created an imperative need to adopt a development methodology that, amongst other objectives, would allow us to:

- Generate a working method whose tasks were clearly defined and assigned to the different collaborating profiles. This would allow us to integrate the work carried out by the Organisation and Computing staff who took part in each of the development projects.
- Homogenise the output documentation that each task passed on to the next, in order to make the projects more efficient.

Based on the recommendations of the document of conclusions, the methodology chosen was MÉTRICA, as it supports the software's whole life cycle and covers object-oriented development. Thus, the gvmétrica project was begun with the aim of adapting MÉTRICA Version 3 to the needs of the CIT. As the project advanced and was implemented in the analysis and design phases of new applications, the need became evident for a CASE(1) tool which supported its implementation and reduced the use of our own templates as much as possible. This was the seed for gvcase, a subproject of the gvpontis project which aimed to develop MOSKitt, an open source CASE tool built on the Eclipse platform.
(1) CASE (Computer Aided Software Engineering) tools are computer applications aimed at increasing productivity in software development in all aspects of its life cycle.

At the programming stage, it is important to have a framework(2) when numerous applications are going to be developed. This led to the creation of gvhidra, another gvpontis subproject, whose objective was to become the working framework and basis for the development of PHP management applications in web environments, as per the CIT's Style Guide(3). The decision as to which framework to use for the development of Java applications is still under study, although initial trials were carried out using Struts and Turbine. We are currently working with a modular combination of JSF in the presentation layer, Spring in the business layer and Hibernate in the data persistence layer.

One of the most important functions required when several developers coincide in the development of an application is a version control system, which uses a common repository to store the programmed files and compare the changes made in the different versions. Initially, CVS was adopted. However, Subversion is currently being implemented because of the advantages it has over CVS: it includes permission control for access to projects and improves file versioning.

An Integrated Development Environment (IDE)(4) was required to make application programming easier. Eclipse was the most mature open source IDE at the time the decision was taken. Its advantages include its large community of users and the use of plug-ins(5), which allow programming in different languages and the extension of its functions. Up until now, we have simply enjoyed the advantages of the application and of its evolution, which has brought with it ever-improving features. Time has confirmed that we made the correct decision in choosing this tool. The MOSKitt project is being developed as an Eclipse plug-in.

Another aspect to be taken into account was the generation of reports with open source code. The CIT selected the ireport tool, which includes the JasperReports engine. This has shown itself to be a tried and tested, highly reliable ally. ireport is a Java-based multiplatform tool whose major advantages include the ease with which highly detailed reports (graphs, pivot tables, etc.) with dynamic content can be generated, and the multiple formats it can export information to (XML, PDF, etc.). Data can be extracted from a wide variety of sources, including all databases. It also includes a viewer which allows users to preview their reports on screen.

Finally, the web portal and the intranet were also part of this migration process. The information was redesigned and restructured using Typo3 as the content manager. Solutions are still to be adopted for the development of fill-in forms in PDF and for some aspects of advanced image editing not covered by GIMP.

(2) A working environment which groups together a predefined set of programmes, libraries and an interpreted language which help to develop and standardise the different parts of a project.
(3) A guide to streamline appearance and usability criteria in the application development process.
(4) A programme composed of a set of tools packaged as an application programme, i.e. it has a source code editor, a compiler, a debugger and a graphic user interface builder.
(5) A computer application which interacts with another application to provide it with a specific function or utility. It is a way of expanding programmes by modules, so that new functions can be added without affecting existing ones or complicating the development of the main programme.

In short, the main lines of work undertaken up until now in the field of corporate and web developments, in the framework of the gvpontis project, are listed below:

General:
- gvdades: work with database management systems.
- gvmétrica and MOSKitt: definition of a development methodology and support for it.
- gvhidra: development of a framework for PHP.
- Implementation of version control systems: CVS and Subversion.
- Implementation of a report generation tool.
- Migration of the web portal and intranet.

Specific lines of work have included (main examples):
- MASTIN migration: workflow for business process management.
- gvadoc: document management system.

Further details on each of these lines of work are given below.
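As one concrete illustration of the per-project permission control that motivated the move from CVS to Subversion, a path-based authorization file (as read by svnserve or mod_authz_svn) might look like the following. The group and repository names here are hypothetical, not the CIT's actual configuration:

```ini
[groups]
gvhidra-devs = adev, bdev

; developers get read/write on the gvhidra project, everyone else read-only
[gvhidra:/trunk]
@gvhidra-devs = rw
* = r

; the gvadoc repository is hidden from everyone except its developers
[gvadoc:/]
* =
@gvhidra-devs = r
```

CVS has no built-in equivalent of this; access control there typically had to be improvised with filesystem permissions or wrapper scripts.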
CHAPTER 2
gvdades: experiences with database management systems

When the gvpontis project to migrate to open source software was started in the CIT, it was divided into various parts. One of these parts gave rise to the gvdades project, which aimed to tackle the migration of the different database managers to open source software. Two database managers were mainly in use:

- Oracle, as the data container for corporate applications. The applications that used this data were developed with tools such as PowerBuilder and Developer.
- Microsoft Access, for departmental and user applications. This was used both as a database and to develop queries, reports and applications via the tool itself. We should stress here that the users themselves requested the use of Access, as some of them were well versed in the programme.

ODBC was used to connect to databases from different systems to capture fields which were then included in the generation of documents, spreadsheets, etc.

The main problems we needed to solve were:

1. Finding alternatives to the use of Microsoft Access as a departmental database. At the time, there were no alternative office management programmes. However, it was found that as a database manager it could be connected to MySQL or PostgreSQL via ODBC. This meant that these programmes could be installed on the users' computers, or on servers if they needed to share data.

2. Finding alternatives to the use of Microsoft Access as a development tool for applications, queries, reports, etc. MS Access had two advantages:
   - end users found it easy to use, so they could define their own tables, queries and reports;
   - it was easy to develop and implement small departmental applications.
   At the time, no other alternative which could meet these requirements was found. Applications had to be developed using other tools, such as PHP, and databases had to use PostgreSQL or MySQL.

3.
Finding alternatives to the use of ODBC/OLE for access from applications such as Word, Excel and in-house developments to the databases.
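For context on this item: an application reaches a database through a named ODBC data source, so the practical question is where that name gets defined on a Linux desktop. Under UnixODBC, a minimal odbc.ini entry for the psqlODBC driver might look like the following (the DSN, server and database names are hypothetical, and the exact keys depend on the driver build):

```ini
[cit_pg]
Description = Departmental PostgreSQL database (example)
Driver      = PostgreSQL
Servername  = dbserver.example.local
Database    = gvdades
Port        = 5432
```

With such an entry in place, any ODBC-aware client (OpenOffice, for instance) can open the `cit_pg` data source by name instead of embedding connection details.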
In this case, there was no problem deciding, because there was a mature alternative called UnixODBC, and it was easy to make ODBC connections from applications that supported it, such as OpenOffice. Thus it was installed on the computers that needed it, using the ODBC drivers provided by the manufacturers. Standard JDBC drivers were used for the Java applications that supported them.

4. Finding alternatives to the corporate database system (Oracle), and its use on Linux. Two major objectives were set out:

- Oracle had to coexist with Linux. We had to make sure that the operating system Oracle ran on was open source. We were aware that we could not totally eliminate the use of the Oracle database, for two reasons: the existence of old developments, and the relationship with other Valencian Regional Ministries that use Oracle.
- The choice of an alternative DBMS with the following features:
  - it had to comply with corporate DBMS requirements (transaction manager, queries, user management, storage management, integrity, source code in the database, scalability, portability, performance, availability, etc.);
  - the existence of ODBC/JDBC clients;
  - the existence of software to connect from Windows clients when migration was not possible;
  - it had to allow data migration;
  - it had to allow source code migration. Although we wanted to carry out the developments without source code in the database, we had no choice if we were to maintain the existing applications;
  - it had to support large objects (raw, long, lob, etc.);
  - it had to have software to connect to and manage databases.

2.1 Analysing the alternatives and making a choice

A study(1) was carried out to compare the three main open source databases existing at the time: PostgreSQL, MySQL and Interbase. A technical comparison was made, and an analysis of how Oracle data could be migrated to these other managers was also carried out.

- Interbase 6.0 was stable, fast, scalable and sufficiently functional. However, documentation was scarce, and we felt that Borland should have provided more support for the project if it was to succeed in the open source software community.
- MySQL 4 was user-friendly, fast and the most widely used. It was well documented and supported, yet it lacked many essential functions (ACID transactions, referential integrity, stored code, triggers, etc.).
- PostgreSQL 7 seemed to be the most comprehensive solution. It was supported by the open source software community; it supported ACID-compliant transactions and concurrency, referential integrity and sequences; and it included procedural languages, a backup system and fault tolerance. In addition, the added value of this database was that it had a reasonably mature module, PostGIS, which maintains geospatial data, which is highly important to our ministry.

(1) See the comparative study of different DBMS under Linux.
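The ACID transaction support that weighed against MySQL 4 can be shown with a minimal rollback example. Python's sqlite3 stands in here for a transactional DBMS, and the table and function names are hypothetical:

```python
import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE budget (dept TEXT PRIMARY KEY, amount INTEGER)")
conn.execute("INSERT INTO budget VALUES ('roads', 100), ('ports', 50)")
conn.commit()

def transfer(conn, src, dst, amount):
    """Move money between departments: either both updates apply or neither."""
    try:
        conn.execute(
            "UPDATE budget SET amount = amount - ? WHERE dept = ?", (amount, src))
        # Simulate a failure between the two updates; without a transaction
        # the money debited above would simply vanish.
        if dst == "missing":
            raise RuntimeError("destination does not exist")
        conn.execute(
            "UPDATE budget SET amount = amount + ? WHERE dept = ?", (amount, dst))
        conn.commit()
    except Exception:
        conn.rollback()  # the first UPDATE is undone atomically
        raise
```

On an engine without transactions (MySQL 4 with its default MyISAM tables, for instance), there is no `rollback` to call, and the half-applied update is left in the data.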
This study led to the decision to implement PostgreSQL version 7.3 as the alternative DBMS to Oracle. However, we continued to monitor MySQL, as we imagined that more functions would be added to it, being the most widely used tool.

2.2 Implementing the chosen solution

Once we had taken the decision to choose PostgreSQL, we installed this database on two servers, one for development and another for production. We tried out some initial developments so we could become familiar with the features of this database and test how well it worked. We took the decision to use PostgreSQL for:

- all new developments, using it as our new OLTP database server;
- the new document file database we were going to develop (see gvadoc);
- the geospatial database, using PostGIS (essential for gvsig);
- later, as a repository for the gforge tool.

We then began a phase of adaptation in which:

- we had to take the incompatibilities and compatibilities between Oracle and PostgreSQL into account. We discovered many of them whilst we were actually developing, such as different names for data types, or commands which worked differently from one system to the other;
- we established the way applications had to work with the database (definition of databases, schemas, database objects, users, groups, tablespaces, security management, etc.);
- we developed procedures to maintain the database (startup, shutdown, backups, support, session management, etc.).

We have continued to make progress with PostgreSQL through its successive versions, from 7.3 to 8.2 (including 7.4 and 8.1).
We have had no problems with updates when migrating from one version to the next.

Data replication problems

One of the main problems we encountered was that, on the one hand, our new developments needed to access their own data, which was created in the new PostgreSQL database, but on the other, they also needed to access data held in the existing Oracle databases. What could we do if we needed data which were in another database, or needed to combine tables which were in other databases? This problem would not have occurred if we had had an intermediate data access layer, such as WebServices or Hibernate, but at the time this layer did not exist. In the face of this problem, we considered two solutions:

1. The new applications could maintain several connections, one to the new database and one to the old database, to recover the data they needed.
2. The Oracle database tables could be replicated in the PostgreSQL database when large sets of data needed to be worked with in both.

Both Oracle and PostgreSQL feature the dblink concept, i.e. database links which allow data to be related between different databases, as long as they belong to the same manufacturer, but not between different ones. We thus began to replicate data by dumping complete data sets from one database to another. This was not a satisfactory solution, for the following reasons:
- They were complete dumps, i.e. the data in the target table was deleted and dumped again. This was fine for small tables, but with many large tables the system could become inefficient.
- As the data was deleted from the tables, foreign keys could not be defined on them, since foreign keys would prevent the data from being deleted.
- As they were complete dumps, they ran once a day. This meant that we did not have the instant replication that was necessary or advisable in some cases.

We have recently developed a tool to maintain replication at table level, for both complete and fast refreshes. It basically consists of a repository defining master tables and slave tables, a trigger system which is used for the fast refreshes, and programmes in PHP which carry out the refreshes and control the replications. This system is already up and running, and it is used for the cases we are not able to resolve with WebServices.

Oracle migration to Linux servers

In parallel, we also started a study on using Oracle on Linux servers. The version we had at the time was not certified for the Linux versions we were using (SuSE 8). We thus began migration from Oracle 8i to Oracle 10gR2, which is specifically intended for Linux servers. Our development servers currently use Oracle 10g on RedHat ES4, and we expect to move on to our production servers in the near future.

Internationalisation

We have reached our current PostgreSQL version through successive database version migrations. We used these migrations to tackle the subject of character set internationalisation: we can currently support UTF8, where we previously only supported ISO (Latin1).

MySQL

We also implemented MySQL 4 on a small scale, thanks to the acquisition of Typo3 and SUN's Identity Manager, which use this database as a repository. This has allowed us to see the functions of this database engine.

Client access to the databases

Another problem that occurs when an environment is changed is that the tools that ran perfectly in a Windows environment with Oracle, such as SQL*Plus, TOAD and Enterprise Manager, do not run in a Linux environment with other databases. The new tools must comply with some of the following requirements:

- Run on Linux.
- Connect to Oracle, PostgreSQL or MySQL (depending on the tool).
- Have functions to create tables, views and source code.
- Have administration and monitoring functions.
- Be open source or free software.

In our search, the experience has been good. Although we have not found any products as comprehensive as TOAD (or other commercial tools), we have managed to find something similar. Table 1 shows the clients we have worked with:
Table 1. Clients we have worked with.

Client: pgadmin3
DBMS: PostgreSQL
Description: Useful for programmers, poor for administrators. Has a web version in PHP (phppgadmin3). Runs on Linux and Windows.

Client: SQLDeveloper
DBMS: Oracle, MySQL
Description: Oracle's own tool, certified to connect to versions 9.x and 10.x. Requires the JDK. Runs on Linux and Windows. Useful for programmers.

Client: Squirrel
DBMS: Oracle, MySQL, PostgreSQL
Description: GPL; plugins can be made to add functions. Written in Java, runs on Linux and Windows. Connects via JDBC to any DBMS.

Client: isqlplus
DBMS: Oracle
Description: SQL*Plus web client to connect to Oracle.

The fact is that we have worked satisfactorily with the first three, and although they do not have as many functions as TOAD, it is a question of personal preference which one to choose. Another aspect to bear in mind was the database administrator's point of view. In the case of Oracle 10g there were no problems, as Oracle provides a very powerful web interface in Enterprise Manager 10g. However, in the case of PostgreSQL we did not find any free GPL tools that convinced us, especially in the field of database monitoring. We obviously imagined these difficulties when we started the project, and we accept them.

Training

When the migration project began, a course was organised at the Universidad Politécnica de Valencia on PostgreSQL administration. Since then, we have made progress on our own by consulting the documentation and practising. The CIT is currently ready to prepare a course on database administration using PostgreSQL. The course manual is available on the gvpontis web site.

Difficulties and problems

We have come across some problems during this process, some of which have not yet been solved:

- A lack of GPL tools to administer and monitor PostgreSQL.
- A lack of companies which provide qualified support for PostgreSQL. A solution is on the way for this problem.
- Security management and access control. As PostgreSQL does not have fine-grained control mechanisms like Oracle Label, Oracle Vault, etc., security has to be implemented using our own programmes. We have not managed to develop these programmes yet.
- Loss of product integration. Compared to Oracle, which has a huge number of complementary products, PostgreSQL only has the database.

Impressions of the migration

The migration has left us with a positive impression. We have managed to work with PostgreSQL and MySQL, and have managed to solve the few incidents we had largely by turning to the knowledge and services of the user community.
We have not taken the change lightly, but we have not been afraid to go through with it. It should be remembered that the change to open source software requires a change of mindset. We cannot expect to obtain the same features in PostgreSQL as we have in Oracle. However, we can see that the migration is proving useful to us. We accept certain losses, which are balanced out by certain benefits, both of a financial nature and through collaboration in projects which are of interest to society and to the CIT.

2.3 Future lines of work

- Improvement of the PHP replication system, and its possible migration to Java.
- Creation of a data warehouse system using PostgreSQL, with support from Mondrian and JasperETL.
- Integration of the alphanumeric databases with gvsig's geospatial databases.
- Analysing whether it is feasible to use MySQL 5. This database is improving in leaps and bounds, especially now that it is backed by SUN.
- Implementing a system which allows us to monitor databases using open source products, such as Cacti or Munin.
- Developing a security module for the applications which manages user access and privileges, audits access and runs independently of the database.
- Using OpenBase2 as the open source alternative to Microsoft Access.
- Selecting companies that are willing and prepared to work with us and provide PostgreSQL support to improve features, i.e. implementing database clusters, high availability, etc.

2.4 Recommendations

We would like to make some recommendations based on our experience:

- If Access or similar programmes are used as database engines for small databases, they can be replaced by MySQL or PostgreSQL, as these two programmes are easy to install and cost nothing in licence fees.
- If there is a need to create many departmental databases (for example, in an organisation which has many work centres and requires a database for each one), PostgreSQL can be installed.
This is easy to administer remotely from a central node and also saves on costs.
- Implementing these systems in a large organisation, as is our case, implies the co-existence of different data base systems which must be able to work with each other. In this case, it is very important to remember that this change does not only affect data bases. Software developments must also be modernised to make up for the disadvantages of not using a sole software service provider. Developments must be separated into layers to prevent coupling.
- We should not be afraid of innovation. These data base managers have the majority of the functions required in many environments. We would like to encourage universities to use these data base managers for teaching and for their work, and also to encourage technology-based firms to use them, so that they can see that this software offers business opportunities through developing improvements or providing support.
- Asking ourselves what we want our data bases to do for us can save us from paying out high licence and support fees, if we opt for the existing open source alternatives that offer the features we need.
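As a minimal sketch of the kind of small departmental data base these recommendations describe: the SQL below is generic and would work equally against PostgreSQL or MySQL through a suitable driver (such as psycopg2), but here it runs against Python's built-in SQLite engine purely so the snippet is self-contained. The table and column names are illustrative inventions, not CIT schemas.

```python
import sqlite3

# Illustrative only: a small departmental data base of the kind that
# could be moved from Access to PostgreSQL or MySQL. SQLite is used
# here so the sketch runs without a server.
conn = sqlite3.connect(":memory:")
conn.execute("""
    CREATE TABLE incidents (
        id INTEGER PRIMARY KEY,
        office TEXT NOT NULL,
        description TEXT,
        reported DATE
    )
""")
conn.executemany(
    "INSERT INTO incidents (office, description, reported) VALUES (?, ?, ?)",
    [("Valencia", "Printer offline", "2008-01-10"),
     ("Alicante", "Network outage", "2008-01-12")],
)
# A query a central node might run over each departmental data base.
rows = conn.execute(
    "SELECT office, COUNT(*) FROM incidents GROUP BY office ORDER BY office"
).fetchall()
print(rows)  # [('Alicante', 1), ('Valencia', 1)]
```

With PostgreSQL the only change would be the connection step; the DDL and queries stay the same, which is part of what makes this kind of replacement low-risk.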
CHAPTER 3
gvmétrica and MOSKitt: definition of a development and support methodology

One of the main objectives of the gvpontis project set out in the document of conclusions was the definition of a comprehensive working environment suitable for the type of developments required by the CIT 1. Basically, the two main lines of work to be followed in this area were:

- The definition of a software development methodology adapted to the CIT.
- The definition and implementation of the supporting architecture for the proposed developments.

These lines of work covered a huge area, so several projects were created within the gvpontis project to tackle them. These projects had to be well coordinated, since the definition of the development methodology had to adopt the conclusions reached with regard to the technical constraints and operational requirements imposed by the CIT Support Architecture. The project to be developed under the first line of work was called gvmétrica. The following criteria 2 were agreed before work was started:

- The methodology chosen had to be an adapted version of MÉTRICA Version 3 3.
- Although the proposed methodology had to cover the whole development process, the adaptation of the analysis and design phases was given priority.
- Object orientation techniques were encouraged in these two phases.

Thus, the objective we wished to meet was to define an approach for the CIT and the companies that work with it, particularly suitable for building various similar information systems which automate administrative procedures and other processes designed by the CIT's process engineers.

3.1 Developing the gvmétrica Project

This project was tackled in stages, which have gradually extended the range of the project.

1 See section C.2, Alternatives for new Corporate Developments and Web Developments using Linux, in the concluding document.
2 The justification for each of these decisions can be found in the concluding document put together by the Linux group.
3 MÉTRICA Version 3 is a method developed by the MAP (Public Administration Ministry) which must be followed in the civil service. It divides the development process into the following phases, which are also called processes: Planning, Feasibility Study, Analysis, Design, Building, Implementation and Maintenance.
The first phase consisted of creating a working team made up of members of the CIT and the Universidad Politécnica de Valencia. This team adapted MÉTRICA's analysis and design processes, which gave rise to the first version of gvmétrica. The work covered in this phase included:

- Selecting the MÉTRICA Version 3 activities and tasks which were deemed necessary for the CIT's analysis and design processes.
- Designing an interface between the Organisation and Computer staff which included all the department's members in the development process. The initial tasks in the analysis phase belonged to the Organisation department, which transferred knowledge about the process to be developed to the Computer Department.
- Defining the techniques and standards to be applied. It was decided that event analysis would be the main analysis technique, using UML as the specification language. A Style Guide was defined which was to be used for all the CIT's applications.
- Documenting the gvmétrica process. The input and output products, the participants' profiles and the techniques to be used in the design of each product were specified for each task.
- Creating the set of handouts or templates which support the contents of the products that flow through the process.
- Integrating the method with the collaborative working environment defined for the CIT by the gvpontis project. In general terms, this consisted of using gforge to manage projects and Plone as a content manager to publish project documentation.

A test phase was begun to validate the methodology. In this phase, we realised that it was impossible to tackle the developments using only the templates defined by gvmétrica, because managing the documentation (publishing and updating it) was often more costly than the analysis itself. A problem with the templates also became apparent.
There was too much information to fill in, and it was not always necessary. Thus the decision was taken to review the way the methodology was published to make it easier to learn. This gave rise to the need to start a new phase in the project. The main objective of this new phase was to make the methodology easy to apply. To do this we had to (1) provide the CIT with tools to apply the methodology more efficiently, (2) define the minimum requirements that analyses in the CIT had to comply with in order to balance efforts across the different phases, and (3) improve the process documentation to make it more user-friendly and thus make the methodology easier to apply. The work carried out during this phase included:

- Establishing UML as the standard to use when specifying the information systems.
- Giving greater weight to UML models to the detriment of templates.
- Defining which UML language elements were to be used to model the CIT's information systems, specifying the UML profiles needed to adapt the UML models to the CIT's information systems, and defining the gvmétrica template subset which was to be replaced by these models.
- Selecting PowerDesigner v.11 4 as the analysis tool, mainly because it was well known in the CIT. It had previously been widely used by analysts in defining and implementing relational data bases from conceptual models based on entity-relationship models. Although this was a commercial programme, it was valid as a temporary solution because it avoided having to include another new element and allowed us to tackle this issue at a later date.

4 It supports UML.
- The full structure that any project modelled in PowerDesigner must have was defined, applying the criteria set out in gvmétrica.
- The aforementioned UML profiles were implemented in the tool, and mechanisms were provided to complete the analyses by linking the models to the templates which could not be replaced.
- Finally, user- and programmer-oriented reports were defined.

When we reached this point, we decided that the methodology was mature enough to be applied in the CIT and that we had the tools to do it, even though we knew there were areas which needed to be completed and others which could be improved. Thus a monitoring phase was initiated to detect errors and, above all, to prioritise improvements. This task continued throughout the lifetime of the project. Regular meetings were held and all those involved in the project were directly monitored and supported. In parallel, many of the projects started in gvpontis also reached their first version, as was the case of gvhidra (the implementation of the CIT's Style Guide in PHP), which made it possible to tackle the integration of the methodology with the support architectures the CIT's developments were to be built on. Thus:

- The UML model catalogue was extended to include a UML profile which facilitated the definition of user interfaces, based on the patterns defined by the Style Guide and implemented by gvhidra.
- A new UML profile was defined to be applied in developments which aimed to automate administrative procedures using a Workflow engine.

Following the approach defended in MDD 5 (Model-driven Development), and more specifically in MDA 6 (Model-driven Architecture), we tried to maintain the models at an abstraction level which allowed information systems to be described as independently as possible from the platform they were to be built on. Unfortunately, this is not always possible, because the guides which include all the rules the programmers must apply to transform the information they receive in the models into source code are not always available. This has forced analysts to include implementation information in the diagrams (the models) to be able to get it to the programmers. The project is currently at this point (the generation of implementation guides). In parallel, the tool chosen to apply the methodology was reviewed for several reasons:

- The use of PowerDesigner, a commercial tool which only works on Windows, was not coherent with gvpontis's general strategy.
- The difficulty of tackling the whole process proposed by gvmétrica using only UML models. UML and the defined profiles were insufficient to specify the systems' user interfaces. Moreover, some of the templates still had to be defined as OpenOffice documents.
- A theoretical base had been built to document the different UML models deemed necessary to complete the analysis of an information system using PowerDesigner. These models were interrelated so that as a whole they completed the information corresponding to the different concepts set out by the methodology. There was a model to specify the system's functions, a model to define the entities required to provide these functions, a model to determine the graphical interface the user employs to access these functions and how this affects the system's different entities, etc. However, these relationships between the different models could not be defined in PowerDesigner, so these models could not be built based on others, and they obviously could not be automatically synchronised with each other either. The main problems arising at this point were that they undermined the consistency and correctness of the analyses carried out using the methodology, in addition to creating an extra cost for the analysts (model editors).
- At the present time, the majority of our staff are familiar with this methodology. Thus, suggesting a change of CASE tools is now feasible.

5 MDD is a software development approach based on modelling software systems and creating them from these models.
6 MDA is an OMG (Object Management Group) standard which promotes MDD and groups together several languages which can be used to follow this approach (UML is one of these languages). MDA explicitly aims to increase the level of abstraction when describing information systems.

3.2 The MOSKitt concept

This led the CIT to carry out several feasibility studies to determine which tool (preferably an open source tool, or at least one which worked on a Linux distribution) it should migrate to. During this feasibility study phase, the open source CASE tools available at the time were analysed using a list of requirements put together by the CIT. The first study showed that there were no open source CASE tools which minimally met CIT requirements. The feasibility study phase included a project between the OO-Method group from the Computer Systems Department at the Universidad Politécnica de Valencia and the CIT to draw up a study of possible solutions for the development of an open source CASE tool which would support the gvmétrica methodology. The first conclusion drawn from this study was that it was possible to use an existing platform to develop the project instead of starting from scratch or adapting an open source CASE tool. The platform was Eclipse, including the plug-ins proposed by the EMP project (the Eclipse Modelling Project, which standardises modelling-related plug-ins developed directly in the Eclipse project). The final document in the feasibility study phase was the gvcase project's technical specifications. In May 2007, the gvcase project was started.
Its objective was to build MOSKitt, an open source CASE tool based on Eclipse to support the gvmétrica methodology. The gvcase project was also integrated in gvpontis. The initial objective of the project has evolved: MOSKitt is now the leading open source modelling platform for building CASE tools which support the development process set out by different organisations, i.e. the tool is versatile enough to support the application of methodologies used in different types of information systems. This project has an important research and innovation component. MOSKitt provides the CIT, and organisations in general, with tools that allow them to apply software engineering in their development processes as per the paradigm of Model-driven Software Development (MDSD), which gives models the main role in a process, compared to traditional developments based on programming languages and object- and component-based software platforms, by:

- Selecting the techniques (UML2, BPMN, etc.) that best suit the type of systems which need to be built in each organisation.
- Defining and providing support for software development processes, to apply the development method that best suits the organisation and the information systems to be developed.

3.3 Future lines of work

Future lines of work in both projects include:

- Finishing the adaptation of gvmétrica to include all the processes and interfaces. The project management interface has now been
adapted. The next task is to present it to the CIT Project Managers so it can be implemented and monitored by the gvmétrica group. The test plan has already begun.
- Preparing the migration of the gvmétrica process from PowerDesigner to MOSKitt.
- Providing the MOSKitt tool with the capacity to automatically produce fully functional software systems, based on models, which meet the CIT's requirements.
- In line with the philosophy of open source software projects, MOSKitt aims to build, propagate and support an interactive, collaborative community, to widen the range of the project and improve the results obtained so that it adapts to the natural evolution of organisational needs in the field of software engineering.
- Adapting the methodology and MOSKitt to applications which do not follow the classical pattern of management applications and, in short, to all types of applications.
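The model-driven idea that gvmétrica and MOSKitt pursue, describing a system in a platform-independent model and transforming that model into platform-specific artefacts, can be illustrated with a deliberately tiny sketch. The model format and both transformations below are inventions of this example (in Python for brevity), not gvmétrica or MOSKitt artefacts:

```python
# A toy platform-independent "model": one entity with typed attributes.
model = {
    "entity": "Expedient",
    "attributes": [("code", "str"), ("year", "int")],
}

def to_python_class(m):
    """Transform the model into source code for one platform (Python)."""
    lines = [f"class {m['entity']}:"]
    args = ", ".join(f"{n}: {t}" for n, t in m["attributes"])
    lines.append(f"    def __init__(self, {args}):")
    for name, _ in m["attributes"]:
        lines.append(f"        self.{name} = {name}")
    return "\n".join(lines)

def to_sql_table(m):
    """The same model transformed for a different platform: SQL DDL."""
    types = {"str": "TEXT", "int": "INTEGER"}
    cols = ", ".join(f"{n} {types[t]}" for n, t in m["attributes"])
    return f"CREATE TABLE {m['entity'].lower()} ({cols});"

source = to_python_class(model)
ddl = to_sql_table(model)
print(ddl)  # CREATE TABLE expedient (code TEXT, year INTEGER);
```

The "implementation guides" the chapter mentions play the role of the two transformation functions here: once they exist, analysts no longer need to smuggle platform details into the diagrams.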
CHAPTER 4
gvhidra: development of a framework for PHP

The basic idea of the project was to create a working framework, based on the model-view-controller (MVC) pattern and implemented in PHP, for creating management applications in web environments, which could integrate the CIT's Style Guide. The gvhidra environment had a series of initial objectives:

1. The tool had to facilitate the migration of small and medium-sized client/server applications developed using MS-Access and/or PowerBuilder.
2. It had to move programmers away (as much as possible) from the ins and outs of HTML and Javascript, so that an MS-Access and/or PowerBuilder programmer could start to develop applications with basic PHP knowledge, by imitating the development method used with PowerBuilder and MS-Access in which the leading role was taken by SQL (business process-oriented applications).

A multitude of GPL tools were used to achieve these objectives. They included Eclipse, Apache, CVS, etc., and in line with the open source spirit, others have been included in the project, such as Phrame, Smarty and PEAR.

4.1 gvhidra's evolution

Achieving this initial objective was hard work. The project was based on an external development which had many shortcomings; we were hardly able to take advantage of anything. We had to learn to look rather than to do: in the open source world there is always somebody who has tried to do something similar and, at worst, you can always learn from his/her experience. This has taught us to look at the state of the art before tackling new tasks. When we achieved our initial objectives, we could judge their advantages and disadvantages. Some of the factors included in gvhidra as advantages also limited the tool's flexibility. For example:

- Moving the programmers away from the difficulties of HTML and Javascript meant that they were NOT able to use them in cases the framework did not envisage.
This forced the gvhidra team to constantly include new functions in the framework. We may have been too strict: something that does fewer things requires greater effort on the programmer's behalf but also offers more possibilities. We had to find a happy medium for programmers between flexibility (scalability, adaptability...) and user-friendliness (utilities, patterns, paradigms...). The framework development team's time is currently taken up with maintaining and developing the framework and building applications on it. The team's dual profile gives us a closer look at the framework's problems, but means it takes longer to introduce improvements and solutions.

- Implementing the Style Guide caused two problems. The Style Guide's appearance restricted the framework's audience to CIT staff and wasted the external potential that all open source projects should have. Moreover, the Style Guide simply regulated the appearance of the programme; there was no explanation of HOW it should work or how it should relate to the user. We began to overcome this hurdle in conjunction with the gvmétrica team.
- The SQL-based, structured analysis and design paradigm of PowerBuilder and MS-Access became obsolete. It became evident that different software architecture models would be required: service-oriented analysis, service orchestration and much more, based on object orientation throughout the software's entire life cycle. This means we must coordinate our efforts with projects such as gvmétrica and MOSKitt.
- Migrating an existing application is not simply a matter of analysing how it works and giving it a new interface via the browser. It implies reviewing the application with the user, who always makes the most of the opportunity to add new functions. In the majority of cases, these reviews have not gone hand in hand with analyses (or the analyses have been informal and there has been no written record of them). gvmétrica has once again been used to overcome this hurdle.
The gvhidra road map always includes making programmers' work easier and adding functions to the framework. This constant evolution is driven by the new internal needs that arise out of the different applications, both new applications and the maintenance of existing ones. An eye is also kept on the latest trends and alternatives in the outside world as another source of improvements.
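The model-view-controller separation at the heart of gvhidra can be sketched in a few lines. gvhidra itself is written in PHP; Python is used here only to keep the sketch compact and runnable, and all class and record names are invented for illustration:

```python
# Minimal MVC sketch of the pattern gvhidra implements in PHP.
class Model:
    """Holds application data and business logic (SQL-backed in gvhidra)."""
    def __init__(self):
        self._records = []
    def add(self, record):
        self._records.append(record)
    def all(self):
        return list(self._records)

class View:
    """Renders data, shielding the programmer from HTML/Javascript details."""
    def render(self, records):
        return "\n".join(f"- {r}" for r in records)

class Controller:
    """Receives user actions and coordinates the model and the view."""
    def __init__(self, model, view):
        self.model, self.view = model, view
    def handle_add(self, record):
        self.model.add(record)
        return self.view.render(self.model.all())

app = Controller(Model(), View())
print(app.handle_add("Expediente 2008/001"))  # - Expediente 2008/001
```

The trade-off the chapter describes falls out of this structure: the more the View layer hides, the less a programmer can do when the framework does not envisage a case.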
CHAPTER 5
Implementation of version control systems: CVS and Subversion

One of the most important functions required when several programmers collaborate on the development of an application is a version control system, which uses a common repository to store the programmed files and compare the changes made between different versions. Initially, we decided to use the mature CVS instead of Subversion as our version control tool. This was a significant advance on the previous procedure, which used a copy on the server to share the files of an application under development and controlled the latest version manually. The only hurdle to be overcome was training the developers to use the tool. At a later point, the need arose to apply more sophisticated, more user-friendly access control to some applications under development than the system offered by CVS. This initially pointed to migrating the whole repository to a different tool. Subversion was considered, as by then it was mature enough to provide the application access control we required. However, as not all the applications required this type of user control, and the applications that did need it were all new, we decided not to migrate the whole project repository as initially anticipated. Instead, the applications which did not require this type of user access control were kept in the CVS repository, and the applications which did require it were placed in the Subversion repository. On this occasion, there was less need for training, as Subversion is similar to CVS in use: although the tools' operations differ, the tasks carried out by the user in daily use are very similar in either of the two. We still think that it is not worth migrating all the applications to Subversion unless determining factors urge us to do otherwise. Thus, we can say that the current situation will continue in the near future.
From our experience in the use of both tools, we highly recommend either of these two version control tools (CVS and Subversion are the most widely used in the world of open source software) for sharing source code among programmers, especially when they are in different locations. The benefits obtained from using such a tool far exceed the learning effort required to use it.
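What either repository provides, a common store of successive file versions plus the ability to compare any two of them, can be illustrated with a toy sketch. This is a conceptual illustration only, not CVS or Subversion internals; the revision-numbering and diff behaviour merely mimic what those tools expose:

```python
import difflib

class ToyRepository:
    """Keeps every committed version of a file and can diff any two."""
    def __init__(self):
        self.versions = []

    def commit(self, text):
        """Store a new version; return its revision number (1-based)."""
        self.versions.append(text)
        return len(self.versions)

    def diff(self, rev_a, rev_b):
        """Compare two stored revisions, as `cvs diff` / `svn diff` would."""
        return list(difflib.unified_diff(
            self.versions[rev_a - 1].splitlines(),
            self.versions[rev_b - 1].splitlines(),
            fromfile=f"r{rev_a}", tofile=f"r{rev_b}", lineterm=""))

repo = ToyRepository()
repo.commit("function total() {\n  return 0;\n}")
repo.commit("function total() {\n  return sum(items);\n}")
for line in repo.diff(1, 2):
    print(line)
```

The real tools add what this sketch omits: concurrent access by many programmers, merging, and (in Subversion's case) the finer-grained access control that motivated the second repository.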
CHAPTER 6
Implementation of a report generation tool

The ireport tool (which includes the JasperReports report engine) is a report design tool which replaces the commercial tools included in the PowerBuilder and Oracle Developer programming environments. The feasibility study carried out before adopting this tool demonstrated its advantages and disadvantages. The advantages of using this tool include:

- Multiplatform: because it is written in Java.
- Availability of a report generation environment which is very similar to professional commercial tools.
- Functions: the tool has no shortcomings compared to a commercial tool. Some particularly salient functions: it has an excellent command of pivot tables, subreports, a large range of graphs and the option to generate labels and bar codes. Reports can be parametrised and customised in many ways and can also be modified and created at run time.
- Project maturity: this project has several years of experience behind it and there have been many versions. This means it has very few errors and it is easy to find documented examples and experiences of other organisations. Spanish and Valencian companies offer training in the tool and work with it on a regular basis, so we can make the most of their knowledge.
- Adaptability: the graphical environment (ireport) and the library that actually does the work (JasperReports) are completely separate. This means that other graphical environments can be chosen and, above all, ensures that the programming interface which creates reports at run time is a complete library with exactly the same possibilities as ireport. It should be remembered that ireport does not generate reports on its own; it delegates all the work to the JasperReports library.
- Export file formats: one of the main advantages of JasperReports is the number of formats its reports can be exported to. These include open source and commercial formats.
The main disadvantages are:

- Conversions: it cannot read a file containing a report design created with a different report generation tool and automatically convert it to a JasperReports design.
- Translation: there is an open source translation project in which people selflessly translate the programme. In the case of the Spanish language, the translation is aimed at the Latin American-speaking population and currently centres on the programmes. The majority of the written documentation generated by JasperSoft is only in English.
- Compilations: as it is an open source project which uses other projects, if source code which comes from another project has to be modified and compiled, the user has to look for the documentation belonging to those other projects, because JasperSoft does not offer full documentation of the whole set, i.e. it does not cover the other products it takes advantage of. When the user wishes to change the configuration or modify something from other products, it refers him/her to those other products. This complicates the full compilation of all the source code, if this were necessary, since each subproject's compilation is documented on its own web site. Fortunately, JasperSoft already offers compiled versions which contain the compilations of all the subprojects.

After analysing the advantages and disadvantages of the application, we decided to adopt and implement it. This process is still in its early stages.

6.1 Implementing the tool

As existing reports cannot automatically be converted to the JasperReports format, and bearing in mind that there is no limit to the number of report generation tools that can co-exist on one computer, it is not initially necessary to establish a plan to gradually convert existing reports. Any new reports required, and existing reports whose design needs to be modified substantially, will be generated in JasperReports. For the planned migration of the reports created using Oracle Developer, JasperReports includes an external open source tool which runs Oracle PL/SQL source code as though it were standard SQL source code. This makes it easy to reuse all the PL/SQL source code contained in the Oracle Developer reports. This tool has not yet been tried because we have not needed to migrate any of these reports.
JasperReports has been well received by our developers, particularly once they saw the number of additional tools available, such as JasperServer, JasperAnalysis and JasperETL, which include report design functions that did not exist in the tools available at the time in the CIT. Likewise, another thing that has encouraged the introduction of JasperReports is its good graphics support. The end users are particularly happy with this support and are asking to be given as much graphical information as possible.

6.2 Future lines of work

The JasperReports tool offers other highly interesting solutions that we have tried and which have a good chance of being implemented:

- JasperServer: a tool for publishing reports which supports permission-based control at several levels (modification of the report design, launching a report to obtain updated information, control of who can access the resulting reports, etc.) and allows any programme to invoke it using open standards, such as web services, or via a graphical user interface (so the report does not have to be accessible from any specific application). It also allows reports to run automatically at specific times, so that when computers are underused it can, for example, automatically collect reports of all the incidents that occur.
- JasperAnalysis: a very powerful tool for data warehouse analysis.
- JasperETL: a tool to automate data migrations and similar tasks. It also allows information to be converted from/to very different formats.
Recommendations

- We do not recommend using the ireport tool on an exclusive basis. To make the most of JasperReports' functions, the JasperReports library can be programmed directly. Many functions have been documented using the JDK's documentation tool; this documentation is distributed for free and is available to the general public on the Internet.
- Publicise the tool's graph generation capabilities and its companion tools, such as JasperAnalysis for data warehousing, to generate interest in them.
- Study the examples that come with ireport and JasperReports. The JasperReports web site offers a version of the tool which includes other examples that can also be modified using the ireport tool.
- Consult JasperSoft's own forum when a programming query comes up.
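The run-time parametrisation that makes JasperReports attractive can be illustrated conceptually: a report design declares named parameters, and the engine fills them with data when the report is run. The sketch below uses Python string templates purely as a stand-in; real JasperReports designs are XML files compiled and filled through its Java API, and the report layout here is an invention of this example:

```python
from string import Template

# A "report design" with named parameters, loosely analogous to the
# parameter placeholders in a JasperReports design file.
design = Template(
    "Incident report for $office\n"
    "----------------------------\n"
    "$body\n"
    "Total incidents: $total"
)

def fill_report(office, incidents):
    """Fill the design with run-time data, as a report engine would."""
    body = "\n".join(f"* {i}" for i in incidents)
    return design.substitute(office=office, body=body, total=len(incidents))

report = fill_report("Valencia", ["Printer offline", "Network outage"])
print(report)
```

Separating the design (the template) from the filling step is the same division of labour as between ireport (design) and the JasperReports library (execution), which is why other front ends can be swapped in.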
CHAPTER 7
Migration of the web portal and intranet

When the gvpontis project started, the CIT had one corporate portal available to the general public and another corporate portal for internal use. The gvsig tool, the CIT's open source software application, also began to be noticed at this time; this project also has an official web site. The different corporate portals had been designed some years earlier, which meant that both new needs and advances in technology required modifications to adapt them to the demands that had arisen during this period. Midway through 2005 we decided to redesign and restructure the information on the web site (internet) and the intranet. The work to be carried out by the web team can be summarised in two parts. One part consisted of updating and maintaining the web contents on both corporate portals. The web content was made up of static HTML pages grouped into directories based on the guidelines followed at the time. The other part involved portal maintenance, such as reviewing broken links and making backups of the different file structures, etc. Other tasks included the generation of statistics and image manipulation for publication on the web. The technology used before the migration project started was an open source web page server running on a UNIX operating system. The corporate web portals were optimised to be consulted using version 4.0 or later browsers at a resolution of 800 x 600. Visibility was ensured on what were the two major browsers at the time, Internet Explorer and Netscape-Mozilla. Apart from the web page server and the machine's operating system, the web team used commercial software for the graphic design tools, a web page editing programme and a tool for designing forms in PDF format, among others.
The choice of which open source programme was to be used for each component of the new platform was driven either by interaction with other departments in the computer area, such as the Systems Department, the Data Base Department and the Development Department, or by the web team's own needs and requirements when deploying the applications it needed to work. Once the Systems Department had evaluated the necessary requirements, SUSE SLES (a Linux server) and the Apache web page server were chosen. TYPO3, an open source content management system (CMS) developed under the GPL, was chosen as the content manager. At the time it was thought to be the programme which best suited the working team's needs. Some of the characteristics which appeared most relevant after its analysis included:

- Its efficient content management and creation of simplified workflows.
- Its flexibility, as it adapted to different needs and distributions.
- Its speed and user-friendliness, through its practical and intuitive interface.
- It was multilingual and fully integrated in over 16 languages.
- Its professional quality results.

The technology TYPO3 uses is the package commonly known as LAMP (Linux, Apache, MySQL and PHP). The acronym LAMP refers to a set of software subsystems needed to reach a global solution, in this case to configure the web portals. Subsystem-wise, Linux is used as the operating system, Apache as the web server, MySQL to manage the data bases and Perl, PHP or Python as the programming language; in our case we used PHP. TYPO3 is managed from a browser, so it does not require any special software on the part of the user. Any current browser with graphic support is valid.

During the course of the gvpontis project the corporate web portals, as was the case with other Computer Department projects, were also migrated to open source software. The aim was to use open source software tools as much as possible in our day to day work developing the corporate web portals. These tools included the data base, the web page server, the interpreted programming language for the creation of dynamic web pages and the web content management system (CMS). Although the data base team ruled out the use of the MySQL data base at the start of the gvpontis project, we later included it because it guaranteed better performance from TYPO3.

We currently work with different versions of each of the tools. This is due to the time elapsed from 2005 to the present day in the implementation of the different portals currently run by the CIT. The first portals we set up using LAMP + TYPO3 were es, and They currently run on the following versions: TYPO3, Apache, PHP 4 and MySQL 4.1.

1 Consult conclusions report.
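To make the LAMP + TYPO3 setup described above concrete, a TYPO3 site is typically served by an Apache virtual host along these lines. This is only a sketch: the host name and paths are invented for illustration and are not the CIT's actual configuration.

```apacheconf
# Hypothetical Apache (1.3-era syntax) virtual host for a TYPO3 portal
# on the LAMP stack described above; names and paths are illustrative.
<VirtualHost *:80>
    ServerName portal.example.gva.es
    DocumentRoot /var/www/typo3/portal

    # TYPO3 serves its pages through index.php
    DirectoryIndex index.php index.html

    <Directory /var/www/typo3/portal>
        Options FollowSymLinks
        AllowOverride All      # TYPO3 ships its own .htaccess rules
        Order allow,deny
        Allow from all
    </Directory>
</VirtualHost>
```

PHP and MySQL complete the stack: Apache hands page requests to the PHP interpreter, which in turn reads the TYPO3 content records from the MySQL data base.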
The and gva.es web portals were developed later and use a more recent version of TYPO3, but still have the same LAMP base configuration. The most recent portals are and gva.es. The latter has the same LAMP base configuration as the previously mentioned portals but uses a newer, more stable version of TYPO3 (version 4.1.5). The web portal is different from the other portals because of the versions it uses for some of the tools. In this portal, the same version of TYPO3 is used as in ideacv, but the LAMP stack differs in the product versions: version 5 was installed for both MySQL and PHP, and the open source NGINX product was configured as the web server.

The combination of tools we have arrived at, and which we currently work with, has given us greater flexibility to maintain and update our current web portals and develop new ones. We design the webs to be compatible with different browser versions, mainly Firefox and Internet Explorer; we also take Mozilla into account. The version of Internet Explorer that runs on Linux is a beta version. It works, but not as well as on Windows, so we still need an Internet Explorer running on Windows to check browsability and the appearance of the web pages.

On the graphic design front, we have the open source GIMP programme, which is a good candidate to replace Adobe Photoshop. The disadvantage we have found is that the programme has a steep learning curve and we have not obtained the same results as with Adobe Photoshop.
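The NGINX-based portal mentioned above would typically hand PHP requests to a FastCGI back end. The following server block is a hedged sketch; the server name, paths and FastCGI port are illustrative assumptions, not the CIT's actual settings.

```nginx
# Hypothetical NGINX server block for a TYPO3 portal with PHP 5
# running as a FastCGI back end; names and paths are illustrative.
server {
    listen       80;
    server_name  portal.example.gva.es;
    root         /var/www/typo3/portal;
    index        index.php;

    # Hand .php requests to a local PHP FastCGI process
    location ~ \.php$ {
        include        fastcgi_params;
        fastcgi_pass   127.0.0.1:9000;
        fastcgi_param  SCRIPT_FILENAME $document_root$fastcgi_script_name;
    }
}
```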
One of the tasks involved in web portal maintenance is the generation of statistics. The open source software tool chosen was AWStats, but it does not have the same potential as the Windows-based tool we used previously. We have had the same problem with the tool for reviewing broken links: we have not found an open source software tool that works as well as the commercial Linkbot tool.

In conclusion, at this point in time we basically use open source software to carry out all our day to day work, with the exception of specific functions such as the generation of fill-in forms (Adobe Acrobat Writer) and advanced image editing (Adobe Photoshop). We also need a Windows platform to check browsability and the appearance of web pages with Internet Explorer.
CHAPTER 8 Workflow for business process management

The workflow or BPMS tool used in the CIT is MASTIN. This is also the workflow standard for the Valencian Regional Government. The transition to open source software involved adapting this application, which at the start of the migration project was based on a technological environment made up of:

- Oracle as the DBMS.
- Developer as the development tool.
- Client operating system: Windows.
- Web browser: Internet Explorer.
- Office technology packages: MS Office 97 (Word and Excel).

During the course of the project, this totally proprietary technological environment evolved in two completely different phases:

1. Migration to the Linux operating system, Firefox browser and OpenOffice office technology package.
2. Migration to PostgreSQL as the DBMS and Java as the development tool.

The first phase finished successfully, whilst the migration to Java, which is independent from the data base, is still underway.

Let us take a closer look at the first phase: the migration to the Linux operating system, Firefox browser and OpenOffice office technology package. The most important aspects dealt with before and during migration were:

- Before migration:
  » Tests, detection and solution of errors in the application (Workflow) in the different versions of Linux installed at the CIT.
  » Study of the different OpenOffice versions existing at the time to find out how to use them and see what they could and could not do.
  » Tests, detection and solution of the problems occurring when using OpenOffice.
- The actual migration affected the following:
  » The Workflow core and the programming of certain aspects of the application which only worked with Windows were modified so that they would work with Linux.
  » Although the application could be run on Linux, Windows had to be used for the forms and lists developed in Developer.
The CIT gradually converted Word documents to OpenOffice as per the change to Linux in the departments in which it was progressively installed. When this process is finished, the application can be run entirely on Linux.

- The use of OpenOffice to generate output documents (proposals, resolutions and notifications) from Workflow, to substitute MS Office, was very time-consuming as a result of the following:
  » All the Word document queries that included conditions had to be modified. New data base functions had to be created to obtain the data that needed to be shown in the document. This had to be done because OpenOffice cannot process conditions. This meant it took longer to create and modify documents for MASTIN.
  » The formulas in Word documents had to be recalculated because OpenOffice could not interpret some of them.
  » Problems with OpenOffice version compatibility: the documents created with one version of OpenOffice cannot be generated with another version installed in the user's PC without all these documents being modified.
  » It takes longer to design and create OpenOffice documents because it is more complicated to draw and insert all the required elements.
- In some specific areas, the problem with using Linux is that what works in one Linux distribution does not always work in another, e.g. the bar codes used by payment documents work in Xaloc but do not work in Lliurex.

Now let us take a closer look at the second phase: the evolution to a version that uses PostgreSQL as the DBMS and Java as the development tool. This version is currently being implemented. Before we started, we reengineered the organisational processes used in process management, and those which MASTIN supports, in order to maximise task efficiency once the tool has been implemented.

- The benefits obtained from this new version are:
  » It is technologically up to date.
  » There is no need for licences.
- To achieve this, the following problems must be solved:
  » Computer staff must be retrained to use the new technologies and tools in the new application.
  » Users must be trained to use the technology in the new application.
CHAPTER 9 gvadoc: document management system

Within the plan to migrate to open source software, gvadoc (document file application) became the first experience of migration from a commercial application. The planning, coordination and monitoring phases of the migration process were completed in this experience.

The CIT had been using document management applications since 1998, with the servers and systems used to physically store the images (saved in file systems or in data bases) evolving over time. gvadoc is the third document management application, after previous experiences with Keonview and Poseidoc.

gvadoc allows document image storage by independent document areas, with a connection to the alphanumeric applications which manage each of the processes dealt with by the CIT. The application's aim is to allow fast, easy access to the documentation contained in each process, whether it has been generated by the CIT or has come from outside, without the need to go to the physical file, which may be located in another building. Likewise, digital documents are easy to consult in queries over the internet by the companies and/or citizens involved in the process, as long as the preset security conditions are met. The application makes it easy to define and set up new document areas in the CIT's other departments quickly and with little effort from the computer staff.

gvadoc is tailor-made to our needs, as it keeps the necessary functions from the previous application and includes new functions suggested by the users. These functions were defined after all the CIT users who worked with the previous application were given a survey. We decided to make an important change in technological environment and thus eliminate our dependence on proprietary products on the market. This meant that the application needed to be fully maintained by the CIT.
A summary of the change in technological environment is shown below, for comparative purposes:

  Previous application     ADOC
  Windows client/server    Multiplatform, 3 layers
  Oracle                   PostgreSQL
  C-Basic                  Java
  Own interface            CIT's Style Guide
gvadoc uses Tomcat as the application server. It was originally implemented in two of the CIT's departments, but it is now used in seven of them. Each of these areas has its own particularities, yet one of the main advantages of the tool is its versatility: the organisation of digital filing cabinets and the characteristics of document specifications are defined according to the needs of each department. The volume of digital information currently takes up 114 GB of storage in a variety of different formats, such as tif, pdf, doc and odt. The documents can be accessed in several different ways:

- Through the corresponding departmental process monitoring application, which brings up the required document on the screen swiftly and directly.
- Directly via the gvadoc application.
- Via an internet browser, where a member of the public can, for example, see the information about an administrative contract under tender (conditions for tender,...) or information about a specific expropriation in real time.

The major problem which had to be tackled was the scanner interfaces. To obtain a multiplatform/GPL application, we decided on a hardware solution (AXIS box) to scan the images. This more expensive solution beat off other Linux and GPL-based software solutions, which have been ruled out at present because of the lack of updated scanner drivers. Another difficulty was convincing the firm that created the application to publish its own libraries, used in the development of gvadoc, under the GPL.

On the positive side, we should mention that the PostgreSQL DBMS has amply covered the data base needs. New departments are gradually adding their files to the system. Likewise, the tool is expected to evolve with new functions, such as the inclusion of digital signature processes and the improvement of external access through Web Services.
Part 2
Operating systems and communications

The Systems and Communications Department has evaluated, studied and implemented different tools in the following areas:

1. End user PC environment:
   » Operating system and graphic desktop in the end user PC environment.
   » Office technology in the end user PC environment: word processor, spreadsheets, e-mail client, web browser, etc.
2. Local network server environment, which provides services such as printing, file sharing and user authentication.
3. Networking and communication environment.
4. Corporate server environment for data bases, application servers and web services.

The work carried out in each area under the gvpontis framework is detailed below.
CHAPTER 10 End user PC environment

When we began the study in January 2004, we were faced with the following: a set of about 1000 PCs with Windows 98, the Outlook Express e-mail client, MS Office for word processing and spreadsheets, instant messaging, Panda antivirus and remote computer management with two proprietary tools, LiveHelp and Tivoli.

The user PC working environment was one of the most sensitive areas of the migration because:

- The applications were client/server, i.e. they ran on the PCs.
- Windows was the only environment the users were familiar with.

Since the tools the applications were developed with did not have modules to run on Linux, we had to find a short term technical solution which allowed us to continue the migration on the users' workstations. We decided on virtualisation because it offered us the following advantages:

- We could implement Linux on the users' PCs without having to wait until the application migration was complete.
- It provided us with a way of interacting with other bodies in our milieu which used Windows environment tools.

Different distributions were considered for this option and all were thought to be suitable. Firstly, we chose the desktop appearance and the different software tools that would replace the ones we used under Windows. The platform was validated and we got ready to implement it. The distribution was SuSE Linux 9.0 with a KDE desktop, OpenOffice, Mozilla to surf the internet and for e-mail, and a Win4Lin emulator with a full Windows 98.

This platform was implemented with a small number of users (ten, to be precise). These users were trained beforehand, they were in favour of change, and we gave them individual technical help which allowed them to resolve all their user queries, as well as unforeseen errors, immediately. Afterwards, once we had checked that the project was viable, the trend in Linux distributions made us think that we could personalise our own distribution. Thus Xaloc arose.
Xaloc was a package management platform and a hardware detection tool which made it easy for us to manage the set of PCs, because we could make massive, unattended installations. As the number of PCs increased, we realised that we were no longer capable of creating new stable versions of Xaloc which could keep up with the speed of changes in hardware (for example, from IDE disks to SATA disks),
of new versions of the tools (for example, from Mozilla to Firefox and Thunderbird), or the demand for new functions which had to be catered for with our own distribution. We decided to abandon this line of work and turn to another project which had specialist staff who could generate the distribution on a full-time basis. Thankfully, the Lliurex project, which suited our needs, was coming to maturity in the Valencian Regional Government. Although this project was very education-centred, it was reasonably suited to our needs and allowed us to add our requirements to its distribution. Thus a plan to implement LliureX in all the CIT's PCs was designed. This implementation will finish in December 2008.

Version 7.11 of Lliurex is currently being installed. This distribution is based on Ubuntu Feisty. It includes the OpenOffice office technology package, the Firefox browser and the Thunderbird e-mail client. As it is aimed at the educational environment, its packages tend to focus on this area and we have therefore been forced to add the following packages:

- Kprinter.
- Java 1.5.
- Adobe Acrobat Reader.
- VMware emulator.

This set of packages was used to create an image for configuring the CIT's computers. Several distributions co-exist in the CIT at present as a result of the excessive number of changes. These are SuSE 9.0, SuSE 10 and Xaloc. These distributions are maintained on the users' PCs to save time and are not migrated to Lliurex as long as they do not create any problems. If any changes have to be made to a PC which is using one of these distributions, we take advantage of the circumstance to migrate it to Lliurex.

VNC, an open source software programme based on a client/server structure, is used for remote computer control. This allows us to take total remote control of any computer acting as a server from a computer acting as a client. This is also called remote desktop software.
VNC allows a different operating system on each computer: the screen of a computer with a particular operating system can be shared by connecting to it from any other computer or device that has a VNC client. This is especially useful in our case because of the coexistence of different operating systems and different distributions on each PC.

We decided to use the VMware software to solve the problem of the applications that only run under Windows. This is a software virtualisation system, i.e. a programme which simulates a physical system with specific hardware features. When the emulator programme is run, it provides a working environment which is, to all intents and purposes, similar to a physical computer, except for pure physical access to the emulated hardware. The VMware product we use runs virtual machines created using other VMware products but cannot create them itself; virtual machines can be created with more advanced products such as VMware Workstation.
CHAPTER 11 Local network server environment

The decision to migrate to Linux as a general purpose operating system also affected the structure and assembly of the local network corporate servers, both at head office and at all the other offices. What we wanted was to be able to dispense with the Windows 2000 PCs that offered the following services:

- Printing.
- User validation.
- File sharing.

The decision we took was for Linux clients to use the services provided by the newly installed Linux servers and for the Windows clients to do the same with the existing Windows servers, i.e. to have parallel infrastructures in both environments. This forced us to maintain a service structure for the Windows PCs until they disappeared from the network and to create another, equivalent infrastructure for the Linux services, which had more and more clients as the PC migration process continued. We took this decision after checking that the integration of the Windows 9x PCs into the new Linux server infrastructure was not going to work because of protocol incompatibility. Moreover, we were not in favour of the Linux clients continuing to use the services provided by Windows 2000 servers, as this went against the objective of migrating all the software we used to the open source world.

The disadvantage of this strategy was that it duplicated the administrator's work, yet it meant we did not have to keep up with the migration speed of other environments, and it ensured users received a service whatever their operating system. This was important, as it was a basic objective of the migration project. Once the users were using the SuSE Linux 9.0 distribution as a client operating system, a server running SLES Linux 8.0 was set up as a shared file and directory server via Samba. We also created a directory on each user PC as a shared resource so that other Windows and Linux users could use it.
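A Samba file and directory server of the kind described above is driven by an smb.conf file. The fragment below is a hedged sketch: the workgroup name, share name, path and group are invented for illustration, not the CIT's actual settings.

```ini
# Hypothetical smb.conf fragment for a shared file and directory
# server on SLES; all names and paths are illustrative only.
[global]
   workgroup = CIT
   server string = Shared file and directory server
   security = user

# A departmental shared directory, reachable from both
# Windows and Linux clients
[shared]
   path = /srv/samba/shared
   read only = no
   browseable = yes
   valid users = @users
```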
In the local network field, the existing technical alternatives in the open source software world for the services that we needed were crystal clear in some cases and particularly tenuous in others. An obvious disadvantage was that the open source software solutions were not integrated into a single graphic interface that made it easy to administer resources. Nonetheless, there were clear, solid alternatives in almost all cases.

We started off with a network of PDC (primary domain controller) servers using the Windows 2000 SP4 operating system (a total of 14 servers). These were network file and print servers against which headquarters and regional office users validated their security permissions and network session in order to access the shared resources and printers.
42 Local network server environment 11 Clients started sessions under Windows 98 SE, Windows 2000 workstation and Windows XP SP1/SP2, obtaining the necessary permissions from the primary domain controllers (PDC) and backup domain controllers (BDC), while the data bases of the users, groups and PCs in the Windows environment were maintained in Active Directory. Accesses to the network structure were announced by the Windows 2000 SP4 PDCs and BDCs with their domain name system (DNS), which assigned the names of the PCs in our network to the CIT s different domains. Security was provided by Microsoft s LDAP structure (i.e. Active Directory). We chose OpenLDAP for user authentication in the local network. In this case, we had a double objective. The trend in new n-layer applications was also to use authentication against LDAP services, so we could integrate both user repositories. The services included in this new infrastructure were: were stored in this server and this is how Linux users were validated in the network (approximately 500 out of a total of 900). As we advanced in this project, our objective became having a common interface for user provisioning. We looked for a sole way to administer the users, i.e. for the Windows and Linux users to be the same irrespective of the PC they used to connect to the network. To achieve this objective, we started an identity management project using Sun s Identity Manager as we did not find any equivalent open source software products. We have made great efforts on the technical and organisational front to integrate our user repositories in the CIT: Active Directory, OpenLDAP and Oracle. Right now we have a sole central point for user provisioning and the Identity Manager maintains the repositories synchronised. 
The next step in this environment will be to integrate other user repositories within the Valencian Regional Government, such as accounts and the users of other applications from other regional ministries LDAP 11.2 CUPS 1 At this point in the migration process, once we had chosen OpenLDAP, an identity management project was started which aimed to unify the Windows Active Directory user repository, the user authentication repository in the Linux network, the data base user repository for old applications and the application user repository for new applications and portals into one sole metadirectory. We were aware of the difficulty in controlling the authentication of users, rights and access restrictions, account profiles, passwords and other attributes required to administer user profiles. In 2004, finding the right solution was extremely difficult as there were no mature products available and manufacturers had no experience in identity management projects. The LDAP Linux server is an OpenLDAP2 which is responsible for maintaining the data base with all the users, user groups and PCs in our Linux network. All the necessary data for each user and PC A CUPS server is used for printing. This system can manage the remote printers in a network. This print service uses the IPP protocol which allows printers configured in this service to be used by Linux clients. CUPS has a user-friendly web interface which allows the printers installed in the CUPS servers to be managed, allows users to see all the print jobs in each printer, change quotas and permissions, modify devices, manage print queues and jobs, add and configure new printers, etc. from any point in our intranet. The only problem we have found is the lack of drivers for some very old printers. However, in the end these problems have been solved 1 CUPS is the acronym for Common Unix Printing System. 
This system allows a mainframe computer to work as a print server and thus centralise printing and list management in a local network environment. It uses the IPP (Internet Printing Protocol) as the basis for print jobs and queues and it is distributed under the GNU General Public License. 41
43 by installing devices to emulate HP printers in GIMP + CUPS mode or by installing generic drivers (Postscript style). This problem has decreased over time as new printers all have Linux drivers BIND The LDAP Linux server is an OpenLDAP2 which is responsible for maintaining the data base with all the users, user groups and PCs in our Linux network 11.4 SAMBA We used SAMBA 2 for file sharing as it had been amply tried and tested in the CIT. SAMBA is still used, although our intention is to get rid of this product, detect and analyse knowledge-sharing needs and design a strategy to do away with the to-ing and fro-ing of files between PCs. Our objective is to prevent data loss from not having a backup copy, prevent the dispersion of personal data in PCs without the knowledge of security heads, etc. We aim to find a collaborative tool together with a portal so that users can exchange files and data but within an administrator-controlled environment. Bind was chosen as the name service as it had been used beforehand in the CIT so that the computers could be located via the internet and in the Valencian Regional Government s corporate network. 2 SAMBA is an implementation of a set of protocols and services over TCP/ IP belonging to Microsoft which constitute what is called NetBIOS or SMB. It allows a Linux computer to behave like a Windows client and/or server. 42
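An OpenLDAP directory of the kind used for Linux network authentication typically stores each user as an entry combining person and POSIX account attributes. The LDIF below is a hypothetical sketch: the DN, user name and numeric IDs are invented for illustration and do not come from the CIT's directory.

```ldif
# Hypothetical LDIF entry for a Linux network user;
# all names and numbers are illustrative only.
dn: uid=jgarcia,ou=People,dc=cit,dc=example
objectClass: inetOrgPerson
objectClass: posixAccount
uid: jgarcia
cn: Jose Garcia
sn: Garcia
uidNumber: 10001
gidNumber: 100
homeDirectory: /home/jgarcia
loginShell: /bin/bash
```

The posixAccount attributes (uidNumber, gidNumber, homeDirectory) are what let a Linux client validate the user and build the session, while the same entry can serve the n-layer applications that authenticate against LDAP.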
CHAPTER 12 Networking and communication environment

Our communication network offers a 24/7 service to the whole CIT, perimeter security based on firewalls, and proxy-based internet access management. Our objective was to continue to maintain this situation during and after the migration to open environments. Due to the multiplatform nature of the TCP/IP protocol used in the CIT, we had no difficulties as far as network communications were concerned in making the services required for the migration work on the existing network. Thus, no changes were made to the WAN and LAN physical network.

The perimeter security service was based on a firewall made up of SPARC processors on a Solaris operating system. The firewall's software and hardware were both updated and moved to a platform based on Intel processors and a Linux operating system. We gained in redundancy and performance with this Linux cluster. No problems arose; in fact, performance improved considerably, although the hardware improvement also helped.

The proxy service we employed for users to go onto the internet, and to access some applications via the internet, already belonged to the world of open source software. Squid, the proxy software, is licensed under the GNU General Public Licence and since it was installed (in 1999) it has always run on an Intel/Linux platform. Thus we had no difficulty in updating it: the Intel hardware and the Squid version were both updated.

Finally, the management and monitoring tools (MRTG and Nagios) we used before the migration began were also licensed under the GNU GPL and ran on Linux platforms, so we had no difficulty in updating versions.
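Proxy-based internet access management with Squid is expressed as ACL rules in squid.conf. The fragment below is a sketch of the general pattern only; the network range, domain list and port are illustrative assumptions, not the CIT's actual policy.

```conf
# Hypothetical squid.conf fragment for proxy-based internet
# access management; addresses and domains are illustrative.
acl cit_lan src 10.0.0.0/8              # internal network (example range)
acl blocked_sites dstdomain .example-banned.com

http_access deny blocked_sites          # rules are evaluated in order
http_access allow cit_lan
http_access deny all                    # default: refuse everything else

http_port 3128
cache_mem 256 MB
```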
In short, considering that Linux applications and environments were already used before the migration project was started, we cannot talk about migration as such, but rather about a normal process of updating and improving these environments, with the exception of the firewall. In this case, we took advantage of the fact that the hardware needed to be updated to migrate from a Solaris environment to a Linux-based environment.
CHAPTER 13 Corporate server environment

Corporate servers were computers with a Unix operating system and SPARC processors. We began to replace them with PCs with a Linux operating system and Intel processors when they became obsolete. In this area, the important change was moving from the mentality of a large capacity, mainframe-type computer which catered for all the CIT's processing needs, to Intel-architecture machines with another, more evenly distributed philosophy aimed at open environments. That is to say, the number of servers needed to provide services to users increased several times over, for the following reasons:

- The servers increased several times over in the local network environment to keep the Windows and Linux environments separate (see Local network server environment, chapter 11).
- The servers also doubled in the data base environment, because PCs with PostgreSQL and MySQL had to be installed both for the production environments and for the development and preproduction environments (see gvdades, chapter 2).
- Since the applications migrated from a client/server model to a three-layer model, new PCs were also installed to run the application servers in the development, production and preproduction environments. These application servers were Apache+PHP for the gvhidra developments and Tomcat for Java applications. JBoss arrived afterwards for more complex applications.
- A series of PCs was installed with different work organisation services at the heart of the computer department. These services included CVS, Subversion, GForge and Zope.

The difficulty in this area was taking on the administration of all these new tools, in addition to administering the operating systems on all the machines that hosted them. As a result of this increase in the number of servers, the backup system in the CIT had to be redesigned.
We looked for a solution whose server would run on Linux but which was heterogeneous enough to allow us to create backups of the new Linux servers and the existing Windows servers. The solution chosen was the backup software LEGATO, whose server part ran on an Intel machine with a Red Hat distribution. Storage was catered for by installing a StorageTek L20 tape library with a theoretical storage capacity of 4 terabytes. This was later replaced by StorageTek's L40 model, which had a theoretical storage capacity of 8 terabytes for backups. We are currently extending our logical backups with a solution based on Intel PCs, a Red Hat operating system and the Bacula software, which has a GPL licence.
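A Bacula deployment of this kind describes what to back up and when in the director's configuration. The fragment below is only a sketch of the resource structure; every name, path, schedule and pool here is an illustrative assumption, not the CIT's actual setup.

```conf
# Hypothetical bacula-dir.conf fragment; all names, paths and
# schedules are illustrative only.
FileSet {
  Name = "LinuxServers"
  Include {
    Options {
      signature = MD5         # checksum each file for verification
      compression = GZIP
    }
    File = /etc
    File = /home
    File = /var/lib/pgsql     # example: PostgreSQL data directory
  }
}

Job {
  Name = "NightlyLinuxBackup"
  Type = Backup
  Level = Incremental
  Client = fileserver-fd      # Bacula file daemon on the server
  FileSet = "LinuxServers"
  Schedule = "WeeklyCycle"
  Storage = TapeLibrary
  Pool = Default
  Messages = Standard
}
```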
Luckily, the market has changed and manufacturers have introduced increasingly powerful machines, with more and more facilities to implement high availability, clusters and load balancing in Intel and Linux environments.
Part 3
GIS and CAD: gvsig

The GIS/CAD area is responsible for carrying out the migration of the tools known as Geographic Information Systems (GIS) and Computer-Aided Design (CAD).
CHAPTER 14 gvsig: introduction

CAD programmes are applications for precise vector editing. Typical users include engineers, architects and other design professionals. These programmes manage a database of geometric elements (points, lines, arcs, etc.) which are manipulated through the graphic display on which they appear, the so-called drawing editor. Users interact via editing or drawing commands or through a graphical user interface which automates the process. Although most CAD programmes have a wide range of uses, in the CIT they are mainly used for editing maps before these are used by GIS applications.

GIS programmes capture, store, manage, handle and analyse geographically referenced information, i.e. information spatially related to land management. A GIS programme works as a database of geographic information (alphanumeric data) which is associated by a common ID with the graphic objects in a digital map. Thus, an object's attributes are discovered by pointing at it on the map and, inversely, asking the database for a record gives its location on the map.

The basic reason for using a GIS is to manage spatial data. The system separates the data into different subject layers and stores them independently so they can be accessed quickly and easily. It also allows users to relate existing data through the objects' topology in order to generate new layers that could not otherwise be obtained. For the CIT, which has direct land management competences, such as public works, transport, ports, airports, energy, architecture, architectural and urban heritage, educational, health and cultural facilities, and land and coasts, GIS is therefore a fundamental tool. Thus, as far as the CIT is concerned, GIS and CAD technologies are directly related to our workflows and are not independent from them.
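The two-way link described above, between graphic objects on the map and records in the attribute database, can be sketched minimally. This is illustrative pseudo-data, not gvsig code; the layer contents and field names are invented:

```python
# Hypothetical sample layer: road features sharing a common ID between
# a geometry (here just a list of (x, y) vertices) and alphanumeric attributes.
roads = {
    1: {"geometry": [(0, 0), (10, 0)], "attributes": {"name": "CV-35", "lanes": 4}},
    2: {"geometry": [(0, 5), (10, 9)], "attributes": {"name": "CV-50", "lanes": 2}},
}

def attributes_of(layer, feature_id):
    """Pointing at an object on the map -> its attributes."""
    return layer[feature_id]["attributes"]

def locate(layer, **criteria):
    """Asking the database for a record -> the object's location on the map."""
    return [f["geometry"] for f in layer.values()
            if all(f["attributes"].get(k) == v for k, v in criteria.items())]

print(attributes_of(roads, 1))      # {'name': 'CV-35', 'lanes': 4}
print(locate(roads, name="CV-50"))  # [[(0, 5), (10, 9)]]
```

Each "subject layer" mentioned in the text is, in this simplified view, one such collection of features queried in either direction.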
CHAPTER 15 gvsig: description and justification of our initial situation

The first step in carrying out the migration plan in the GIS/CAD area was a prior study of the real situation in the CIT. We needed to find out how many users employed these technologies, what computer applications they used, which parts of these applications they actually used, and so on. We therefore carried out a survey, which yielded the following results:

- There were 90 GIS/CAD technology users.
- On the CAD front, different versions of AutoCAD and MicroStation were used.
- On the GIS front, different versions of the ESRI family of products were mainly used.
- Vector data was used. The use of raster data was virtually non-existent (except as visual information).
- Over 90% of these users actually employed only a tiny part of the functions offered by these programmes. Only a very small number of users used advanced functions.
- CAD programmes were used for advanced map editing prior to building the topology, which properly belongs to GIS.
- A wide variety of formats was used in both CAD and GIS, which made it difficult for CIT staff to exchange information amongst themselves and with the outside world. Files were shared via local networks and neither standards nor spatial databases were used.

Once this analysis had been carried out, we had to find out what the situation of these products was like in the world of Linux. The initial survey yielded one of the most desolate scenes in the migration project: there was a total lack of mature solutions in the open source world and in the Linux environment which could meet the requirements of GIS and CAD technology users in the CIT.
An analysis of the solutions on the market and in the open source community led us to the following conclusions:

- In the CAD field, there were a few important projects in the open source software community, but none of them was sufficiently mature to replace the commercial tools we were using. The most important projects were QCAD and SAGCAD.
- Format-wise, there was no efficient, commonly used CAD standard. The use of commercial, closed DWG formats was commonplace, which made it even more difficult to find an open source alternative.
- In GIS, only desktop clients were used; the CIT did not use other GIS technologies such as map servers or spatial databases.
- GIS-wise, there were no applications which were mature, powerful and user-friendly enough to replace the commercial tools commonly used in the CIT. The most important applications were GRASS and JUMP. Although GRASS is an excellent software package, it is more raster- than vector-oriented and, at the time, it did not have a user-friendly interface. JUMP was not powerful enough and had performance problems, which led us to reject it.
- However, there was a large amount of libraries and source code in the open source software community which was widely used by different applications (including commercial ones, such as the GDAL library, which is used by ESRI technology) and which could serve as the starting point for developing applications.
- Format-wise on the GIS front, talk of Spatial Data Infrastructures and of the standards defined by the Open Geospatial Consortium (previously the Open GIS Consortium), which allow remote access to geographic data, was becoming widespread. Moreover, the most widely used file formats had open specifications, which meant that formats were not a problem in the migration project.
- There was a European Community initiative, which in the end became the INSPIRE Directive, centred on map access for member state authorities via OGC standards.

The relationship between these two studies - needs and solutions available - led us to the following conclusions:

- In the case of the CIT, it was feasible to tackle the GIS and CAD migration together because the use of CAD systems as map editors was part of the process of using GIS tools.
- Although there were no applications which could directly replace what we had, there was a great deal of knowledge (source code, libraries, projects, etc.) in the open source software community, which made it possible to build an open source tool covering the needs of the vast majority of CIT users.
- The solution had to centre on new, more efficient working methods, working with standards and with remote data stored in spatial databases.
- Although our efforts had to focus on the desktop, we also had to bear in mind the possible uses of other GIS families, such as map servers, which could extend the opportunities for spatial data management. The development of a desktop client therefore had to be tackled as part of a larger system, i.e. as a commitment to the so-called Spatial Data Infrastructure (SDI).

This meant that the first major landmark of the gvpontis migration project was the development of a GIS application that complied with a series of basic requirements:

- It had to be an application that catered for both the GIS and CAD worlds.
- It had to be open source. Our starting point was the knowledge hosted and built by the open source software community. This knowledge was going to be extended and promoted, and thus had to be given back to this community. We decided on the GNU General Public Licence.
- It had to be multiplatform. The application had to run independently of the user's operating system. The main reason for this was the migration process itself, which was to last four years; during this time, users with different operating systems - Windows, Linux and Mac OS X - had to coexist in the CIT. This led us to choose Java as the programming language.
- It had to follow the standards of the OGC and of the INSPIRE Directive (still an initiative at the time).
- It had to be extensible. The project was envisaged from the start as having a scalable architecture so it could evolve towards new areas of use in geographic information.
- It had to have a user-friendly interface. The interface had to be easy for users to learn so that the migration was as painless as possible and did not clash with the habits acquired over many years of using other applications.
- It had to be multilingual. The application had to be easy to translate into other languages. Initially it was available in Spanish, Valencian and English.

The project to develop this tool, and the name of the tool itself, was called gvsig.
CHAPTER 16 gvsig: how it has evolved into the current solution

The launch and monitoring of the migration project in the GIS/CAD area over the four years up to now has been marked by constant technological and institutional evolution. The project has had a far greater impact on the open source software community and on the world of Geographic Information Systems than we could ever have imagined. This process can be analysed from a technical viewpoint, i.e. the quality of the product we have obtained, and from other equally or more important viewpoints, such as the project's adoption by other government bodies, its use in university degree courses and the creation of a prestigious business fabric around it. All these aspects have made gvsig a flagship project, firstly for the CIT and secondly for the Valencian Regional Government, and have made the Valencian Region an international benchmark in the field of open source geomatics.

The process from a technical viewpoint

As a product, a series of landmarks have allowed the objectives, and the gvsig application itself, to evolve beyond our initially established aims. These landmarks correspond to the extension of gvsig into new branches of development. Although the development of gvsig was initially aimed at catering for the needs of the majority of CIT users, once this objective had been reached, new development phases were launched to cater for the needs of all the users in the CIT (a number which grew in parallel with the evolution of the project and with the new needs arising from the extended use of this land management tool) and in the other Valencian Regional Ministries, which had also begun to use and integrate the project as part of their essential computer tools. These projects were divided into the following phases:

- Vector gvsig. The initial project, which led to the creation of a GIS tool with the basic functions for managing vector data.
- gvsig as an SDI client. In line with the European INSPIRE Directive, gvsig implements a series of standards for accessing, locating and searching for spatial data. This area was extended with the development of tools which allow data to be published according to standards.
- gvsig+cad. This project aimed to eliminate the need to use CAD tools independently from GIS tools by developing advanced map editing functions within gvsig itself.
- Raster gvsig. As satellite images, orthophotos, etc. are now more economical, and as a wide range of raster data located in remote services is accessible via OGC standards, the opportunities for accessing raster data are far greater nowadays. This meant we could consider developing raster data analysis functions within gvsig. Moreover, this line of work allowed us to integrate another open source software project centred on geographical analysis, called SEXTANTE 1.
- 3D gvsig. This project aimed to add the third dimension to complement spatial analysis. Functions related to animation and simulation were also included in this area.
- Advanced vector gvsig. The direct evolution of the first phase. Advanced map editing tools in vector format, spatial symbol building and the ability to automatically generate reports were included in gvsig.
- Mobile gvsig. gvsig for mobile devices, complementing the desktop client. This allowed users to take gvsig into the field to collect and check data.

At the present time, gvsig is a complete, highly powerful vector GIS which allows users to work with the most popular vector and raster data formats used in cartography. The vector formats it supports are: .shp (shapefile), .dxf (AutoCAD exchange format), .dwg (AutoCAD format), .dgn (MicroStation format) and .gml (standard). It also supports spatial databases such as PostGIS, MySQL, ArcSDE and Oracle Spatial. The tools gvsig offers include data loading, navigation (zooms, panning, etc.), information requests (information about an element, distance measurement, etc.), theme mapping (legends by unique values, by intervals, self-labelling, etc.), element selection (graphic selection, selection by attributes, spatial selection, etc.),
tables (statistics, sorting, relating tables, linking tables, etc.), map builder, geoprocessing tools, etc. In short, everything required to work with vector data. The gvsig project continues to improve this line of work by developing advanced vector GIS tools which will make gvsig more powerful in this field: new symbol and advanced labelling tools (including a symbol builder), network management, diagrams, reports, support for new formats such as MapInfo formats, connection to ArcSDE, etc. The technical capabilities achieved in each of these areas are detailed below.

gvsig as a vector GIS tool

As mentioned above, the first phase tackled in the gvsig project was to cater for the needs of an average vector Geographic Information System (GIS) user in the CIT. These needs were progressively catered for over the first two years: the most frequently used tools were tackled first and, once this had been achieved, we went on to implement the less frequently used ones.

1 SEXTANTE (Extremadura Spatial Analysis System) is a project developed by the University of Extremadura and funded by the Extremadura Regional Government. It is distributed under a GNU General Public Licence.

>> gvsig view with vector data access.
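Of the tools listed above, spatial selection is a good illustration of what a vector GIS does under the hood: selecting the features whose geometry satisfies a spatial predicate. A minimal sketch of one such predicate, the classic point-in-polygon test by ray casting, follows. This is illustrative only, not gvsig's actual implementation (gvsig itself is written in Java):

```python
def point_in_polygon(x, y, polygon):
    """Ray-casting test: does point (x, y) fall inside the polygon
    given as a list of (x, y) vertices?"""
    inside = False
    n = len(polygon)
    for i in range(n):
        x1, y1 = polygon[i]
        x2, y2 = polygon[(i + 1) % n]
        # Count crossings of a horizontal ray going right from (x, y);
        # an odd number of crossings means the point is inside.
        if (y1 > y) != (y2 > y):
            x_cross = x1 + (y - y1) * (x2 - x1) / (y2 - y1)
            if x < x_cross:
                inside = not inside
    return inside

square = [(0, 0), (4, 0), (4, 4), (0, 4)]
print(point_in_polygon(2, 2, square))  # True  (inside)
print(point_in_polygon(5, 2, square))  # False (outside)
```

A "select inside a polygon" tool simply applies a predicate like this to every feature of the active layer (usually accelerated by a spatial index rather than a linear scan).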
gvsig as a Spatial Data Infrastructure client. INSPIRE

The gvsig project has been a firm advocate of interoperability and of following standards from the start. In the case of spatial information, these are set by the OGC 2, a consortium mainly made up of companies, government organisations and universities; the gvsig project, through the CIT, is a member. gvsig was one of the first GIS tools to begin implementing these standards, in many cases ahead of widely used commercial applications. Following standards, under the principles of interoperability, is fundamental to meeting the European INSPIRE Directive. The gvsig project works closely with the Directorate General which promotes INSPIRE 3 (the Joint Research Centre).

The INSPIRE Directive recognises the fact that the majority of quality spatial information is produced and available at local and regional level, but that this data is difficult to use in a wider context. The most significant problems facing spatial data in Europe include fragmentation, a lack of homogeneous data between member states, duplicated effort when creating data, and problems in identifying existing data and accessing the data which is available. INSPIRE has defined a series of basic principles to tackle these problems. These principles shall be followed when building the system and the data included in it.

2 The OGC (Open Geospatial Consortium) is an industrial consortium whose members work together by consensus to improve and achieve interoperability in the field of geotechnology. The OGC's vision is of a world in which everybody benefits from available services and spatial data via any network, application or platform. The OGC's mission is to provide spatial interface specifications which are publicly available for general use (OpenGIS Specifications). This means that the OGC does not produce standards, it produces technical specifications (like the W3C).
The OGC was created in 1994 to help the GIS industry get away from the isolation it was subject to within the general context of information technology.

3 The main objective of Directive 2007/2/EC of the European Parliament and of the Council of 14 March 2007, establishing an Infrastructure for Spatial Information in the European Community (INSPIRE), is to make relevant, homogeneous, high quality geographic information available in order to support the formulation, implementation and evaluation of Community policies.

These principles are summarised in the following figure:

>> INSPIRE Principles

- Data should be collected once and maintained at the level where this can be done most effectively.
- It must be possible to combine seamlessly spatial information from different sources across Europe and share it between many users and applications.
- It must be possible for information collected at one level to be shared between all the different levels, e.g. detailed for detailed investigations, general for strategic purposes.
- Geographic information needed for good governance at all levels should be abundant and widely available under conditions that do not restrain its extensive use.
- It must be easy to discover which geographic information is available, fits the needs for a particular use and under what conditions it can be acquired and used.
- Geographic data must become easy to understand and interpret because it can be visualised within the appropriate context and selected in a user-friendly way.

Under these principles, INSPIRE envisages a distributed network of information repositories, unified by standards and protocols which ensure the compatibility and interoperability of geospatial data and services.
This means that, because the electronic data and services are hosted by regional and national organisations and are implemented using common standards, they should be easily accessible and can be combined regardless of administrative borders to create the technical part of the SDI. A working group is in charge of defining INSPIRE's architecture and standards in order to harmonise the different national SDIs. In Spain, this working group is coordinated by the National Geographic Institute. The gvsig project, through the CIT, is an official member
of this working group and of some important sub-groups, such as the UNSDI (United Nations Spatial Data Infrastructure) group.

Within the INSPIRE architecture, gvsig plays the role of an important, or rich, client for accessing spatial data. The fact that gvsig is an SDI client means it can load remote layers from different origins, combine them with local information and work with them in any of the variants proposed by the Open Geospatial Consortium (OGC): WMS, WFS and WCS. As an advanced SDI client, gvsig forms part of a family of programmes which allow a complete SDI system to be set up with open source software. These applications include MapServer, GeoServer, Deegree, PostGIS and GeoNetwork which, together with gvsig, provide a wide range of options for choosing non-commercial software.

gvsig was developed with the idea of being an Integrated Geospatial Information System. This means that gvsig has a wide variety of tools to analyse, manage and work with all kinds of geospatial information. Being an SDI client means that the origin of the data is not important when we apply these tools; we can work with both remote and local data.

gvsig is compatible with several OGC interface specifications: WMS, WFS, WCS, catalogue and gazetteer.

- WCS is the acronym of Web Coverage Service. In this case, the data are raster layers in original GIS formats. gvsig can load these layers, which are normally satellite images or orthophotos, and work with them as though they were normal raster layers.

In addition to these data access services, Spatial Data Infrastructures include what are called discovery services which, as their name indicates, are used to find information that complies with certain search criteria. There are two discovery services for SDIs, both of which are implemented in gvsig: the catalogue and the gazetteer, described below.

- WMS is the acronym of Web Map Service.
It produces maps of spatial data dynamically from geographic information. This international standard defines a map as a representation of geographic information as a digital image file which can be displayed on a computer screen; a map does not include the actual data. The maps produced by a WMS are usually generated in an image format such as PNG, GIF or JPEG. gvsig can access these WMS services and load the map images as just another layer.

>> gvsig view with access to different WMS standard services.

- WFS is the acronym of Web Feature Service. Whereas a WMS uses raster formats (PNG, GIF, JPEG) to share layers, the WFS standard uses GML, the Geography Markup Language. WFS allows advanced access to vector data, which in gvsig means we can work with the data as though it were local vector data to carry out analyses, theme legends, geoprocessing, etc.

1. Catalogue service. This allows us to search for map resources using key fields, such as name, scale, theme, etc., which return a list of matching metadata (data describing map resources). These resources can be accessed directly (gvsig loads them as a layer) or indirectly, by showing a reference explaining how to obtain the resource. Thus, when we use gvsig as a catalogue client and
introduce search criteria, the application returns the resources, located on the corresponding server, which comply with these criteria.

2. Gazetteer service. A gazetteer, in our case, is a list of georeferenced place names, i.e. a list in which each place name carries information about its geographic coordinates. gvsig can use the gazetteer service to search for the location of a particular place name; the application then zooms to the geographic area the place name refers to.

In addition to working with standards, gvsig also supports non-standard services, such as ESRI's ArcIMS and ECWP. gvsig allows the different SDI services to interoperate within an advanced GIS client, providing users with the tools required for everything from basic queries through to complex spatial analysis requests. The gvsig project continues to improve the SDI client line of work to support new standards and to make it easier to work with the other parts of an SDI. The WFS-T standard and a publishing extension have recently been developed; their objective is to be able to export gvsig views to the different open source map servers on the market according to OGC standards.

gvsig as a map editor

As mentioned previously, a CAD programme is a computer-aided design programme. As such, it has a wide variety of uses, including industrial design, architecture and map editing. In gvsig, the objective was not to create a CAD programme but to implement the tools required to carry out strict map editing within the application and so eliminate the dependence on a CAD programme. gvsig therefore has vector editing tools which allow elements to be modified, created and deleted. gvsig can edit a shapefile, a layer in our spatial database or a CAD file. gvsig aims to bear the end user in mind at all times and tries to make the different functions user-friendly and similar to what the user already knows.
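Returning briefly to the OGC services described in the SDI section: a WMS GetMap call is, at bottom, an ordinary HTTP request whose query parameters are fixed by the specification, which is what makes clients from different vendors interoperable. A minimal sketch of building such a request follows; the server URL and layer name are hypothetical, and gvsig itself (written in Java) does this internally rather than through code like this:

```python
from urllib.parse import urlencode

def wms_getmap_url(base_url, layers, bbox, width=800, height=600,
                   srs="EPSG:4326", fmt="image/png", version="1.1.1"):
    """Build an OGC WMS 1.1.1 GetMap request URL.

    The server answers with a rendered map image (PNG/GIF/JPEG), not with
    the underlying data -- which is why a WMS result is loaded as just
    another raster-like layer.
    """
    params = {
        "SERVICE": "WMS",
        "VERSION": version,
        "REQUEST": "GetMap",
        "LAYERS": ",".join(layers),
        "STYLES": "",                             # default styles
        "SRS": srs,                               # named CRS in WMS 1.3.0
        "BBOX": ",".join(str(c) for c in bbox),   # minx,miny,maxx,maxy
        "WIDTH": str(width),
        "HEIGHT": str(height),
        "FORMAT": fmt,
    }
    return base_url + "?" + urlencode(params)

# Hypothetical example: an orthophoto layer over the Valencia area.
url = wms_getmap_url("http://example.org/wms", ["orthophoto"],
                     bbox=(-0.6, 39.2, -0.1, 39.7))
print(url)
```

WFS and WCS requests follow the same pattern with different REQUEST and parameter sets, returning GML features and raster coverages respectively instead of a rendered image.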
The CAD part of the application has a command console which allows users to work in a very similar fashion to some of the most widely used programmes on the market. gvsig includes drawing aids, from grids and undo commands (such as the command stack) through to complex element selections (inside a circle, outside a rectangle, etc.). gvsig has tools for inserting elements, such as points, polygons, lines and ellipses, and for modifying these elements, for example by rotation and symmetry. It is interesting to note at this point that the worlds of GIS and CAD are normally seen as opposites when they are in fact complementary; gvsig aims to integrate them. Future versions will feature new CAD tools which complete the current functions (stretching, clipping, auto-completing polygons, etc.) and consolidate gvsig as a leading tool for precision map editing.

gvsig as a raster GIS and remote sensing tool

Initially, gvsig had few raster GIS tools; in the latest versions these have increased considerably. gvsig currently offers basic raster GIS functions: viewing and visual analysis functions (histograms, filters, colour tables, etc.), digital image processing (map algebra, transformation functions, image merging, etc.), spatial analysis functions (statistical functions, creation of digital terrain models, surface interpolation, image profiles, etc.) and temporal/multi/hyperspectral analysis functions.

As part of raster gvsig, the application has been prepared to be integrated with the SEXTANTE project. This collaboration has given gvsig over 200 additional analysis functions, mainly aimed at the study of digital terrain models and hydrology. SEXTANTE was initially developed on the SAGA core and has migrated to gvsig in its latest versions, thus mutually enriching both projects. SEXTANTE processes geographic information and centres mainly on modelling and data analysis using raster images, although it also has many functions for working with vector data.
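The "map algebra" just mentioned operates cell by cell on aligned raster grids. A toy sketch with plain in-memory grids follows, using the NDVI vegetation index purely as an illustration (the grids are invented; real raster layers would come from image files or remote WCS services, and neither gvsig nor SEXTANTE works this way internally):

```python
def map_algebra(op, *grids):
    """Apply a cell-wise operation to equally sized, aligned raster grids."""
    rows, cols = len(grids[0]), len(grids[0][0])
    return [[op(*(g[r][c] for g in grids)) for c in range(cols)]
            for r in range(rows)]

def ndvi(nir, red):
    """Normalised Difference Vegetation Index for a single cell."""
    return (nir - red) / (nir + red) if (nir + red) else 0.0

# Hypothetical 2x2 near-infrared and red reflectance bands.
nir_band = [[0.8, 0.6],
            [0.4, 0.2]]
red_band = [[0.2, 0.2],
            [0.2, 0.2]]

result = map_algebra(ndvi, nir_band, red_band)
print(result)
```

Most raster analysis functions (filters, terrain indices, cell statistics across layers) reduce to this pattern: one operation evaluated per cell over one or more co-registered grids.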
>> SEXTANTE in gvsig.

Its uses include: point analysis, pattern analysis, basic hydrological analysis, cell statistics for multiple raster layers, nearest neighbour statistics for a raster layer, geostatistics, geomorphometry and relief analysis, basic tools to analyse and calculate raster layers, tools for discrete layers and categorical information, lighting and visibility, profiles, vegetation and hydrological indices, etc. The list of these functions grows with every new version.

gvsig as a 3D land management tool

This area of the gvsig project aims to provide 3D and animation functions so that gvsig becomes a tool capable of presenting and analysing information in an effective, attractive manner. gvsig is thus being extended so that it can work in 3D with massive raster and vector data, including remote services and common 3D object formats, and with vector images and data organised in time series, or with attributes that indicate their time range, which are easier to understand in an animated presentation.

>> Spherical and flat 3D views in gvsig.

mobile gvsig: a GIS for mobile devices

This is the last line of work to be started in the gvsig project. It aims to give gvsig the necessary functions to become integrated in mobile devices, such as mobile telephones, PDAs and tablet PCs. A first pilot version is available and, gradually over the next few months, new versions of gvsig for mobile devices will be published.

There is a public road map on the gvsig web site describing all the functions which have already been implemented and those which are currently being implemented. Likewise, the entire technical process, including tasks such as development, testing and documentation, is documented and publicly available on the web site.

The process from a non-technical viewpoint

We shall now detail other related gvsig tasks:
- gvsig symposia. Annual symposia on the project as a meeting point for gvsig's user and developer community. The gvsig symposium has been held on three occasions, once a year since the project was started. The fourth symposium will be held in December 2008 under the slogan "Moving forward together".
- Diffusion. This line of work aims to publicise the gvsig project in the traditional GIS user world and in other areas which are potential users but which have not yet come across this type of technology. It includes the publication of articles, news items, the presentation of papers at congresses, etc.
- Creation of spaces for the community. This aims to provide spaces for the community to get in contact: the creation of mailing lists, web sites for cataloguing unofficial projects, etc.
- Inter-authority relations. Relations with other authorities and universities that wished to use the project, in some cases formalised under agreements.

Each of these sections entails a series of secondary tasks and procedures which are documented and accessible to users with permissions on the web site.

>> 3rd gvsig Symposium. Valencia Conference Centre.

The symposia were planned as an annual meeting point for the gvsig community and an occasion to reflect on the progress made in the project. The number of participants, and the diversity of their origins, has grown considerably year after year. Participants at the first symposium came mainly from the Valencian Region, whilst the third symposium drew visitors from a wide variety of countries, such as Germany, France, Venezuela and Italy. The first symposium consisted of presenting gvsig, whilst last year presentations featured a wide range of experiences in using the project, including archaeology at the University of Valencia, town planning management in Extremadura and water management in Tanzania in areas where access to water is difficult.
Around a thousand people are expected to attend this year's symposium. In the words of Mr. Gaspar Peral Ribelles (Undersecretary of the Valencian Regional Ministry of Infrastructure and Transport): "In just four years the gvsig symposia have become an international benchmark for the world of Geographic Information Systems and Spatial Data Infrastructure."

>> Map editing in gvsig.

The study of the results obtained from these tasks gives us some indication of the evolution of the project:
>> Chart: participants at the gvsig symposia - 1st (Sharing Knowledge), 2nd (Building Realities), 3rd (Consolidate & Move Forward).

The gvsig symposia have welcomed illustrious speakers such as Max Craglia, the head of INSPIRE at the European Commission's Joint Research Centre. All the information about the 4th gvsig symposium, and the previous ones, can be found on the gvsig web site.

Another indicator of the success of the symposia, and by extension of the project, is the number of sponsors and partners. This is an important indicator because it marks the growth of the business fabric directly related to the inclusion of gvsig in the business models adopted by companies.

>> Chart: sponsors and partners at the gvsig symposia (1st, 2nd and 3rd).

gvsig diffusion

The diffusion of gvsig has been vital in extending the project's use. This has involved a wide variety of actions, such as:

- Attending events. Congresses, symposia, seminars, workshops, talks at universities, etc. The significance of the gvsig project, with regard to its participation in events, has increased hand in hand with the application's functions. Although participation centred on domestic events during the first two years, gvsig has been presented in several countries around the world over recent years, on many occasions as the main project. All the papers are available on the project web site.
- Publication of articles. The main national and international GIS technology and software magazines have featured articles about "the gvsig phenomenon", as it was labelled in Directions Magazine. These magazines include: AutoCAD Magazine (Spain), Mapping (Spain and Latin America), GIS Business (Germany), GIS Development (world), GEOinformatics (Europe), GIM International (Europe and North America), TodoLinux (Spain), Revista IGP (Portugal), etc.
A chapter was given over to the project in the book Jornadas de Infraestructuras de Datos Espaciales de España 2006 and d d Press articles. gvsig has appeared in different, mainly specialised magazines and journals on a constant basis. Half a million entries in Google is an indicator of this and it is esti- 59
61 Complete migration to open source software dd In the third stage, it began to be used in an ever larger number of government authorities on the world stage: the Munich City Council, the National Centre of Information Technology in Venezuela, the Agustín Codazzi National Geographic Institute in Colombia, the Military Geographic Institute in Argenmated that it has also appeared in over 100 different online and written media. d d Google Trends. A Google tool that marks user search trends and the appearance of a term, in our case gvsig, on the net, is another indicator to be looked at to see how well the project has been publicised. The following graph shows a trend comparison with one of the most widely used, classic GIS commercial programmes (Geomedia). managed by the community itself (it already has over a hundred users). d d Creation of a project catalogue. A space for those who wish to make unofficial contributions to the project (developments, manuals, etc.) and post them to the community. The project catalogue can be consulted at: dd Support for internationalisation. Community volunteers who wish to have gvsig in their own language can do so very simply. They are sent files and instructions from the project so that they can translate them. gvsig is currently available in over 10 languages (English, Spanish, Valencian, French, Portuguese, Chinese, German, etc.) and the application is being translated into 10 more languages (Russian, Japanese, Greek, etc.) Inter-authority relations >> gvsig (blue) compared to Geomedia (red) in Google Trends Creation of spaces for the gvsig community The following lines of work have been chosen: d d Opening the project up to the community, not only by distributing binary and source codes, but also by providing access to all the possible information. This information can be found on the following two web sites: y dd Finding spaces to exchange information and support. 
Three official lists were created for this purpose: an international list in English (500 users) and two in Spanish for users and developers (over a thousand users). A space has recently been created for the project list in Italian which is The project has evolved as follows: dd At the first stage, gvsig was used by a few users in the CIT. This gradually grew to considerably outnumber the initial GIS users: initially there were 90 users, now there are over 400 in the CIT. dd In the second phase, the project began to be used in other Valencian Regional Ministries, such as the Ministry of the Environment, Water, Town Planning and Housing, the Ministry of Industry, Trade and Innovation and the Ministry of Education. In short, it became the Valencian Regional Government s open source GIS. 60
62 gvsig: how it has evolved into the current solution 16 tina, the EU s Joint Research Centre, the Metropolitan Body for Hydraulic Services in Valencia (EMSHI), the Valencia City Council, the Spanish Ministry of Development, the Basque Regional Government, the Spanish Association of Land Registrars, the Inter-Island Council of La Palma, etc. In addition to being implemented in government authorities, gvsig is also becoming a regular feature in universities. Some of the universities directly involved in the project include the Universidad Politécnica de Valencia, the Jaume I University, the University of Castile-La Mancha, the Technology University of Madrid, the University of Girona, the Open University of Catalonia, the University of Patagonia, the University of Rennes and the University of Salzburg. Other organisations, such as Engineering without Borders and Geographers without Borders, use the project in cooperation programmes they carry out in Africa and Latin America. This has entailed a series of tasks aimed at maintaining cooperation relations with these bodies. Many of these users do not only use gvsig as such, they also develop customisations and improvements to the application to cater for their own particular needs, either on their own initiative or through other companies. An example of this has taken place in the CIT itself, in which gvsig has been customised for the Ports and Coastal Division for coastal management and for the Road Safety Department to manage the road network. The following map shows the countries in which the gvsig application has been downloaded: >> Countries with gvsig downloads (in yellow). 61
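The internationalisation workflow described in this chapter (the project sends volunteers text files of user-interface strings, and the volunteers translate only the values) is the approach commonly used by Java applications of this era via `.properties` resource bundles. The sketch below is a minimal, hypothetical illustration of that workflow in Python; the file contents, keys and language codes are assumptions for the example, not gvSIG's actual resources:

```python
# Minimal sketch of a .properties-style resource bundle lookup,
# mimicking the translate-a-text-file workflow described above.
# Keys and bundle contents are illustrative, not gvSIG's actual files.

def parse_properties(text):
    """Parse 'key=value' lines, ignoring blank lines and '#' comments."""
    table = {}
    for line in text.splitlines():
        line = line.strip()
        if not line or line.startswith("#"):
            continue
        key, _, value = line.partition("=")
        table[key.strip()] = value.strip()
    return table

# One bundle per language; a volunteer translates only the values.
BUNDLES = {
    "en": parse_properties("menu.file=File\nmenu.view=View"),
    "es": parse_properties("menu.file=Archivo\nmenu.view=Ver"),
}

def tr(key, lang, fallback="en"):
    """Look up a UI string, falling back to the default language."""
    return BUNDLES.get(lang, {}).get(key) or BUNDLES[fallback][key]

print(tr("menu.file", "es"))  # -> Archivo
print(tr("menu.view", "fr"))  # -> View (unknown language falls back)
```

The fallback behaviour is the key design point: an incomplete translation still yields a usable interface, which is what makes incremental community translation practical.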
CHAPTER 17
gvSIG: conclusions

The main conclusions we can draw are that:

- The number of users of GIS/CAD technologies in the CIT has increased from 90 to 400. User needs have not only been catered for; the GIS has become a widely used tool in the CIT.

- The level of use of GIS in the CIT has increased. Current users have more and more tools at their disposal: although at the beginning of the project 90% of the users used 10% of the tools, this percentage is increasing and more and more functions are being used. The number of users has grown hand in hand with their degree of knowledge.

- The gvSIG project has brought knowledge of Spatial Data Infrastructure into the CIT, which has implemented its own tool as a Spanish SDI node in line with the European INSPIRE Directive. This has optimised spatial data management, moving from working with files to working with spatial databases and standards.

- The project, first taken on board by the Valencian Regional Government and then by the open source software community, has become an international benchmark.

- The Valencian Region has become a benchmark for open source geomatics, alongside the Valencian companies and universities which are developing the project.
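SDI nodes such as the one described in the conclusions typically publish spatial data through standard OGC services; WMS (Web Map Service, defined in the glossary) is the most common of these. As an illustration only, the sketch below composes a WMS 1.1.1 GetMap request URL. The endpoint and layer name are hypothetical assumptions, not the CIT's actual service:

```python
from urllib.parse import urlencode

def wms_getmap_url(endpoint, layer, bbox, width=800, height=600,
                   srs="EPSG:4326", fmt="image/png"):
    """Build an OGC WMS 1.1.1 GetMap request URL.

    bbox is (minx, miny, maxx, maxy) in the given spatial reference system.
    """
    params = {
        "SERVICE": "WMS",
        "VERSION": "1.1.1",
        "REQUEST": "GetMap",
        "LAYERS": layer,
        "STYLES": "",
        "SRS": srs,
        "BBOX": ",".join(str(v) for v in bbox),
        "WIDTH": width,
        "HEIGHT": height,
        "FORMAT": fmt,
    }
    return endpoint + "?" + urlencode(params)

# Hypothetical endpoint and layer, for illustration only.
url = wms_getmap_url("https://example.org/wms", "roads",
                     (-1.6, 37.8, 0.7, 40.8))
print(url)
```

Because the request is a plain parameterised URL defined by an open standard, any compliant client, gvSIG included, can consume the same service, which is precisely the interoperability the INSPIRE Directive aims at.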
CHAPTER 18
gvSIG: future lines of work

Future lines of work follow two basic premises:

- To continue to improve gvSIG. Adding new tools which work alongside the current ones, such as topography applications, road network management tools, geostatistics and sensors, will increase the use of these technologies even further and optimise spatial management, which is so essential for public authorities and particularly for the CIT.

- To encourage collaboration. We will continue to boost the gvSIG community and the participation of more and more players who work together to build the project.

Once again, in the words of Mr. Gaspar Peral Ribelles: "The slogan for these symposia is: gvSIG. Moving forward together. The previous symposia's slogan was Consolidate and Move Forward. In those symposia we discussed the fact that gvSIG had grown significantly and quickly and that it was necessary to control and consolidate this growth so that we could continue to move forward successfully. We are now ready to do this. Our aim now is not to go over what has already been done, but to build the future of gvSIG together. We want the gvSIG community to see each other, talk to each other, and exchange ideas and opinions, so that gvSIG keeps moving forward. But our aim is to do this together, as our slogan clearly shows."
References

- Valencian Regional Ministry of Infrastructure and Transport
- gvPONTIS project
- gvSIG
- MOSKitt
- IDABC
- Wikipedia
Glossary

ACID: Atomicity, Consistency, Isolation, Durability; a set of properties that guarantee that database transactions are processed reliably.
BDC: Backup Domain Controller.
BIND: The Berkeley Internet Name Domain, a DNS server.
BPMN: Business Process Modeling Notation, a graphical representation for specifying business processes in a workflow.
BPMS: Business Process Management System; Business Process Management (BPM) is a method of efficiently aligning an organization with the wants and needs of clients.
CAD: Computer Aided Design.
CASE: Computer-Aided Software Engineering.
CIT: Valencian Regional Ministry of Infrastructure and Transport.
CMS: Content Management System.
CUPS: Common Unix Printing System.
CVS: Concurrent Versions System.
DBMS: Database Management System.
DNS: Domain Name System.
EMP: Eclipse Modeling Project, a software project.
ESRI: A software development and services company providing Geographic Information System (GIS) software and geodatabase management applications.
GB: Gigabyte, a unit of information used, for example, to quantify computer memory or storage capacity.
GIS: Geographic Information System.
GNU GPL / GPL: The GNU General Public License (GNU GPL or simply GPL) is a widely used free software license, originally written by Richard Stallman for the GNU project.
HTML: HyperText Markup Language, the predominant markup language for Web pages.
ID: Identification.
IDE: Integrated Development Environment.
IPP: Internet Printing Protocol.
JDBC: Java Database Connectivity.
JSF: JavaServer Faces, a specification for developing Java-based Web applications.
KDE: The K Desktop Environment, a free software project based around its flagship product, a desktop environment for Unix-like systems.
LAMP: An open source web platform consisting of Linux, Apache, MySQL and Perl/PHP/Python.
LAN: Local Area Network.
LDAP: The Lightweight Directory Access Protocol, an application protocol for querying and modifying directory services running over TCP/IP.
MDA: Model-Driven Architecture.
MDD: Model-Driven Development.
MVC: Model-View-Controller, an architectural pattern in computer software development.
MySQL: A relational database management system.
ODBC: Open Database Connectivity.
OLE DB: Object Linking and Embedding, Database; sometimes written as OLEDB or OLE-DB.
OGC: Open Geospatial Consortium.
OLTP: Online Transaction Processing, a class of systems that facilitate and manage transaction-oriented applications, typically for data entry and retrieval transaction processing.
PC: Personal Computer.
PDC: Primary Domain Controller.
PDF: Portable Document Format.
PHP: PHP Hypertext Pre-processor, an open source programming language.
PostgreSQL: An object-relational database management system.
SATA: The Serial Advanced Technology Attachment computer bus, whose primary function is transferring data between the motherboard and mass storage devices (such as hard disk drives and optical drives) inside a computer.
SDI: A Spatial Data Infrastructure, a framework of spatial data, metadata, users and tools that are interactively connected in order to use spatial data in an efficient and flexible way.
SME: Small and Medium Enterprises.
SQL: Structured Query Language, a database computer language designed for the retrieval and management of data in relational database management systems, database schema creation and modification, and database object access control management.
TCP/IP: The Internet Protocol Suite (commonly TCP/IP), the set of communications protocols used for the Internet and other similar networks.
UML: Unified Modeling Language, an object modeling and specification language used in software engineering.
UTF-8: 8-bit UCS/Unicode Transformation Format, a variable-length character encoding for Unicode.
VNC: Virtual Network Computing, a graphical desktop sharing system which uses the RFB protocol to remotely control another computer. It transmits the keyboard and mouse events from one computer to another, relaying the graphical screen updates back in the other direction, over a network.
WAN: Wide Area Network.
WCS: Web Coverage Service.
WFS: Web Feature Service.
WMS: Web Map Service.
XML: Extensible Markup Language, a general-purpose specification for creating custom markup languages.
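As a small illustration of the UTF-8 glossary entry above (a variable-length encoding for Unicode), the snippet below shows how different characters occupy different numbers of bytes; the sample characters are arbitrary:

```python
# UTF-8 is variable-length: each Unicode code point maps to 1-4 bytes.
samples = ["A", "é", "€", "漢"]
for ch in samples:
    encoded = ch.encode("utf-8")
    print(f"U+{ord(ch):04X} {ch!r} -> {len(encoded)} byte(s): {encoded.hex()}")

# ASCII stays at 1 byte, a Latin accent takes 2, the euro sign and a
# CJK character take 3; this backward compatibility with ASCII is why
# UTF-8 suits mixed-language systems like the ones described here.
assert [len(c.encode("utf-8")) for c in samples] == [1, 2, 3, 3]
```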
City of Ryde Drives Business Forward with Enterprise-wide Information Management Solution
Case Study City of Ryde Drives Business Forward with Enterprise-wide Information Management Solution Effective Case Management in HP TRIM Improves Business Processes, Builds Foundation for Single View
Agile Business Suite: a 4GL environment for.net developers DEVELOPMENT, MAINTENANCE AND DEPLOYMENT OF LARGE, COMPLEX BACK-OFFICE APPLICATIONS
Agile Business Suite: a 4GL environment for.net developers DEVELOPMENT, MAINTENANCE AND DEPLOYMENT OF LARGE, COMPLEX BACK-OFFICE APPLICATIONS In order to ease the burden of application lifecycle management,
Architecture Design & Sequence Diagram. Week 7
Architecture Design & Sequence Diagram Week 7 Announcement Reminder Midterm I: 1:00 1:50 pm Wednesday 23 rd March Ch. 1, 2, 3 and 26.5 Hour 1, 6, 7 and 19 (pp.331 335) Multiple choice Agenda (Lecture)
EMBL-EBI. Database Replication - Distribution
Database Replication - Distribution Relational public databases EBI s mission to provide freely accessible information on the public domain Data formats and technologies, should not contradict to this
Decomposition into Parts. Software Engineering, Lecture 4. Data and Function Cohesion. Allocation of Functions and Data. Component Interfaces
Software Engineering, Lecture 4 Decomposition into suitable parts Cross cutting concerns Design patterns I will also give an example scenario that you are supposed to analyse and make synthesis from The
Object Oriented Database Management System for Decision Support System.
International Refereed Journal of Engineering and Science (IRJES) ISSN (Online) 2319-183X, (Print) 2319-1821 Volume 3, Issue 6 (June 2014), PP.55-59 Object Oriented Database Management System for Decision
Database Resources. Subject: Information Technology for Managers. Level: Formation 2. Author: Seamus Rispin, current examiner
Database Resources Subject: Information Technology for Managers Level: Formation 2 Author: Seamus Rispin, current examiner The Institute of Certified Public Accountants in Ireland This report examines
