Friday, June 15, 2012

About me ...


Last year I graduated from the Faculty of Automation and Computer Science at the Technical University of Cluj-Napoca. In my final year I went to Munich, Germany on an Erasmus scholarship to work on my diploma thesis. The subject was to develop a technique that enables robots to autonomously acquire models of unknown objects, thereby increasing their understanding of the environment. Such robots can become experts in their respective environments and share information with other robots. The main activities were building the object model with the OpenCV/C++ library on Ubuntu, model acquisition for textured objects based on data collected with the Kinect sensor, image processing, and object detection and recognition. I interacted with people from all over the world, working 8 hours a day in labs equipped with well-known robots such as the PR2.
It was the most interesting experience I have ever had, and it changed the way I see things in a time of rapid technological progress. Because of the independent lifestyle and my limited financial resources, I returned home and started a Business Intelligence career at ISDC, a company specialized in software development, turn-key projects, nearshoring and outsourcing.
I have one year of BI experience working with Microsoft tools, but my passion for this field comes from my internship at Tenaris, a leading supplier of tubes and related services for the world's energy industry, where I used an LMS (Learning Management System), a SAP (Systems, Applications, Products) platform, to update the employee database. I was also involved in company training at Yonder, where I learned the basics of SAP ERP (Enterprise Resource Planning), working with SAP NetWeaver IDES ECC 6.0. I hold a SAP certification in ABAP Workbench fundamentals and concepts.

As a Business Intelligence developer I was involved in three projects:
1. Partial development of a BI solution for Rabobank International. In this project we used the Kimball star schema methodology to design the data warehouse. The chosen technologies were Microsoft SQL Server 2008 as the database server, SQL Server Integration Services (SSIS) as the ETL tool and C# as the programming language.
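In the Kimball approach, measures live in a fact table surrounded by descriptive dimension tables. The sketch below shows the general shape with hypothetical table and column names (DimDate, DimCounterparty, FactTrade); it is only an illustration, not the actual project schema.

-- Minimal star schema sketch: two dimensions and one fact table.
-- Table and column names are illustrative only.
CREATE TABLE dbo.DimDate (
    DateKey      int      NOT NULL PRIMARY KEY,   -- e.g. 20120615
    CalendarDate date     NOT NULL,
    [Year]       smallint NOT NULL,
    [Month]      tinyint  NOT NULL
);

CREATE TABLE dbo.DimCounterparty (
    CounterpartyKey  int           IDENTITY(1,1) PRIMARY KEY,  -- surrogate key
    CounterpartyCode nvarchar(20)  NOT NULL,                   -- business key
    CounterpartyName nvarchar(200) NOT NULL
);

CREATE TABLE dbo.FactTrade (
    TradeKey        bigint        IDENTITY(1,1) PRIMARY KEY,
    DateKey         int           NOT NULL REFERENCES dbo.DimDate (DateKey),
    CounterpartyKey int           NOT NULL REFERENCES dbo.DimCounterparty (CounterpartyKey),
    TradeAmount     decimal(19,4) NOT NULL,   -- additive measure
    Currency        char(3)       NOT NULL
);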

The data warehouse was the central storage area of the system: it stored trade transaction information with complete history and kept it in an output database ready for export to external systems. Along with the functional tables, the data warehouse had a set of utility tables to store configuration parameters, audit logs, error logs and user preferences. The Core database was designed using Data Vault, a method of designing a database to provide historical storage of data coming in from multiple operational systems, with complete tracing of where all the data in the database came from. The solution received a set of files in an input folder; whenever a file arrived it was extracted, transformed and loaded into the databases, and after extraction all input files were archived in an archive folder. The output files were generated once all expected files had arrived, and the ETL process generated messages which were stored in a log table. The system used three databases for main data storage and two auxiliary databases:
- Staging: the data undergoes basic validations and transformations here.
- Core: the central repository of the data (like a data warehouse).
- Output: the data is stored here in the format expected in the output files.
- Configuration: stores all parameters and other configuration data on which the system is based.
- AuditAndLog: stores all information regarding auditing, together with all the messages that the processes generate.
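To illustrate the utility tables, here is a minimal sketch of a log table in the AuditAndLog database and a procedure the ETL processes could call to record messages. The object names (dbo.ProcessLog, dbo.LogMessage) and columns are assumptions for illustration, not the project's actual schema.

-- Hypothetical AuditAndLog objects; names and columns are illustrative only.
CREATE TABLE dbo.ProcessLog (
    LogId        bigint         IDENTITY(1,1) PRIMARY KEY,
    ProcessName  nvarchar(50)   NOT NULL,      -- e.g. 'F2S', 'S2C', 'C2O'
    Severity     nvarchar(10)   NOT NULL,      -- 'INFO', 'WARNING', 'ERROR'
    [Message]    nvarchar(4000) NOT NULL,
    LoggedAt     datetime2      NOT NULL DEFAULT SYSUTCDATETIME()
);
GO
CREATE PROCEDURE dbo.LogMessage
    @ProcessName nvarchar(50),
    @Severity    nvarchar(10),
    @Message     nvarchar(4000)
AS
BEGIN
    -- Every ETL process writes its messages through this single entry point.
    INSERT INTO dbo.ProcessLog (ProcessName, Severity, [Message])
    VALUES (@ProcessName, @Severity, @Message);
END;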
The process view was implemented by means of six interdependent processes:
- F2S (File to Staging): loads the data from the input files into the Staging database.
- S2C (Staging to Core): loads the data from the Staging database into the Core database.
- C2O (Core to Output): loads the data from the Core database into the Output database and later into the output files.
- LOG: writes the messages produced to external files, in the format required by the consumer systems of those files.
- CFA (Check File Arrival): checks whether the expected files have arrived and sends a notification if a file has not arrived within the specified time window.
- PURGE: purges old data from the databases.
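Below is an illustrative sketch of how the F2S, S2C and C2O chain could be orchestrated in T-SQL, reusing the hypothetical dbo.LogMessage procedure above. In the actual project the orchestration was done with SSIS and C#, and the load procedure names here are assumptions.

-- Illustrative orchestration of the F2S -> S2C -> C2O chain.
-- The load procedures (dbo.LoadFileToStaging, ...) are placeholders.
CREATE PROCEDURE dbo.RunLoadChain
AS
BEGIN
    BEGIN TRY
        EXEC dbo.LogMessage 'F2S', 'INFO', 'File to Staging started';
        EXEC dbo.LoadFileToStaging;

        EXEC dbo.LogMessage 'S2C', 'INFO', 'Staging to Core started';
        EXEC dbo.LoadStagingToCore;

        EXEC dbo.LogMessage 'C2O', 'INFO', 'Core to Output started';
        EXEC dbo.LoadCoreToOutput;
    END TRY
    BEGIN CATCH
        -- Record the failure and re-raise so the caller (e.g. an SSIS package) sees it.
        DECLARE @err nvarchar(4000) = ERROR_MESSAGE();
        EXEC dbo.LogMessage 'LOAD', 'ERROR', @err;
        RAISERROR(@err, 16, 1);
    END CATCH
END;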


2. The second project was the development of specific modules, using SQL Server 2012, for the internal BI project used to develop and monitor the structure of the company. The source data is collected from operational systems (iTimeTrack, CRM, SmartOffice) and from CSV files (costs, budgets) into a staging area. From the staging area the data moves into a Data Vault, in order to record the history of changes, and then into the Data Marts area, which prepares the data for reports. The main tasks were to:
- create the Audit database structure (tables that log all the events occurring in the system);
- create the staging area database structure (dynamic SQL scripts to generate tables, primary keys, indexes and views that concatenate the table columns);
- create the Data Vault database structure (SQL scripts used to generate Hubs, Satellites and Links; see the sketch after this list);
- create the Data Mart database structure (SQL scripts to generate dimension and fact tables);
- create PIT (point-in-time) and bridge tables to improve query performance;
- analyze the full and delta load for each table;
- build the ETL integration: stored procedures to import the data from source to staging area, from staging area to Data Vault and from Data Vault to Data Mart;
- create linked servers;
- create the SSIS packages used to import the CSV source files, plus package configuration and deployment;
- use Power View from SharePoint to create reports.
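For the Data Vault part, a minimal sketch of what the generated Hubs, Satellites and Links could look like is shown below, using a hypothetical Employee/Project pair. The table names, hash keys and attributes are assumptions for illustration, not the actual model.

-- Hypothetical Data Vault objects: a Hub holds the business key,
-- a Satellite holds its changing attributes, a Link relates two Hubs.
CREATE TABLE dbo.HubEmployee (
    EmployeeHashKey char(32)      NOT NULL PRIMARY KEY,   -- hash of the business key
    EmployeeNumber  nvarchar(20)  NOT NULL,               -- business key
    LoadDate        datetime2     NOT NULL,
    RecordSource    nvarchar(50)  NOT NULL
);

CREATE TABLE dbo.SatEmployee (
    EmployeeHashKey char(32)      NOT NULL REFERENCES dbo.HubEmployee (EmployeeHashKey),
    LoadDate        datetime2     NOT NULL,               -- start of validity of this version
    RecordSource    nvarchar(50)  NOT NULL,
    EmployeeName    nvarchar(200) NULL,
    Department      nvarchar(100) NULL,
    CONSTRAINT PK_SatEmployee PRIMARY KEY (EmployeeHashKey, LoadDate)
);

CREATE TABLE dbo.HubProject (
    ProjectHashKey  char(32)      NOT NULL PRIMARY KEY,
    ProjectCode     nvarchar(20)  NOT NULL,
    LoadDate        datetime2     NOT NULL,
    RecordSource    nvarchar(50)  NOT NULL
);

CREATE TABLE dbo.LinkEmployeeProject (
    LinkHashKey     char(32)      NOT NULL PRIMARY KEY,
    EmployeeHashKey char(32)      NOT NULL REFERENCES dbo.HubEmployee (EmployeeHashKey),
    ProjectHashKey  char(32)      NOT NULL REFERENCES dbo.HubProject (ProjectHashKey),
    LoadDate        datetime2     NOT NULL,
    RecordSource    nvarchar(50)  NOT NULL
);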

3. For over five months I was involved in a BI maintenance project covering a mix of applications and technologies used to gather, provide access to and analyse data and information about company operations. The client was Bovemij Verzekeringen, an insurance company from the Netherlands. The main tasks gave me the ability to resolve database administration issues and to recognize the connections between situations. Typical issues were: cube deployment errors; wrong mapping of columns in the cube; updating links with the right values; setting the protection level in order to get package access; implementing business logic (for example, a job that runs only if another job has been processed correctly, or only if more than one business day has passed, as sketched below); adding rights for users; moving the project to TFS; upgrading the BI solution to gain performance, robustness and a better platform for further improvements; implementing a purge process to adjust the space allocation for files; code review and research into reducing SQL Server deadlocks; and implementing backup and recovery planning.
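The job dependency mentioned above could be checked from the first step of the dependent job with a query against the SQL Server Agent history in msdb, roughly as sketched below. The job name 'Load upstream data' is a placeholder, and the actual implementation may have been different.

-- Illustrative check: continue only if the last run of an upstream job succeeded.
-- 'Load upstream data' is a placeholder job name.
DECLARE @lastOutcome int;

SELECT TOP (1) @lastOutcome = h.run_status             -- 1 = succeeded
FROM msdb.dbo.sysjobhistory AS h
JOIN msdb.dbo.sysjobs AS j ON j.job_id = h.job_id
WHERE j.name = N'Load upstream data'
  AND h.step_id = 0                                     -- the job outcome row
ORDER BY h.run_date DESC, h.run_time DESC;

IF ISNULL(@lastOutcome, 0) <> 1
BEGIN
    -- Failing this step makes SQL Server Agent stop the dependent job.
    RAISERROR(N'Upstream job did not complete successfully; aborting.', 16, 1);
END;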

Each of the projects developed has an up-to-date technical design and system architecture.
I also spent time preparing for the Querying Microsoft SQL Server 2012 certification in order to be able to write complex queries, and I learned how to improve SQL performance using SQL Server Profiler and the Database Engine Tuning Advisor.
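As an example of the kind of analysis this involves, a query like the one below, based on the sys.dm_exec_query_stats DMV rather than on Profiler itself, lists the most expensive statements by total CPU time so they can be examined and tuned.

-- Top 5 statements by total CPU time since the last SQL Server restart.
SELECT TOP (5)
    qs.total_worker_time / 1000 AS total_cpu_ms,
    qs.execution_count,
    SUBSTRING(st.text,
              (qs.statement_start_offset / 2) + 1,
              ((CASE qs.statement_end_offset
                    WHEN -1 THEN DATALENGTH(st.text)
                    ELSE qs.statement_end_offset
                END - qs.statement_start_offset) / 2) + 1) AS statement_text
FROM sys.dm_exec_query_stats AS qs
CROSS APPLY sys.dm_exec_sql_text(qs.sql_handle) AS st
ORDER BY qs.total_worker_time DESC;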

Last update of my profile: September 2012 


View Monica Opris's profile on LinkedIn
