Somani Sharma
Phone: 366-910-5021
Email: somani.sharma21@yahoo.com
SUMMARY:
- Over 7 years of IT experience in analysis, design, development, testing, maintenance and implementation of complex data warehousing applications, using ETL tools such as Informatica and databases such as Oracle and SQL Server 2005 in Windows and UNIX environments.
- Over 6 years of strong experience implementing the Extraction, Transformation & Loading (ETL) life cycle using Informatica PowerCenter/PowerExchange v9.0.1/8.6.1/7.x.
- Extensively worked with PL/SQL in Oracle 10g/9i/8i, SQL Server, DB2 and Teradata, including procedures, functions, triggers and SQL*Plus.
- Experience in data masking of sensitive elements using ILM (Information Lifecycle Management).
- Experienced with Informatica Designer, Workflow Manager, Workflow Monitor and Repository Manager.
- Strong experience in identifying user requirements, system design, writing program specifications, and coding and implementing systems.
- Worked on data analysis and profiling for source and target systems; good knowledge of data warehousing concepts, staging tables, dimensions, facts and star schemas.
- Expertise in developing and running mappings, sessions/tasks, workflows, worklets and batch processes on the Informatica server.
- Involved in database design, entity-relationship modeling and dimensional modeling using star and snowflake schemas.
- Experienced with the Erwin 7.x/4.x modeling tools.
- Extensively worked with mappings using transformations such as Filter, Joiner, Router, Source Qualifier, Expression, Normalizer, Union, Update Strategy, Unconnected/Connected Lookup and Aggregator, and with Slowly Changing Dimensions (SCD) Types 1, 2 and 3.
- Good experience with Change Data Capture (CDC).
- Strong experience tuning mappings and sessions for better performance.
- Experience in unit testing each mapping developed.
- Extensively worked on creating and executing test cases and test scripts using manual and automated methods.
- Worked on a production support team maintaining the mappings, sessions and workflows that load data into the data warehouse.
- Excellent technical and analytical skills, with a clear understanding of the design goals of ER modeling for OLTP and dimensional modeling for OLAP.
- Experience in UNIX shell scripting.
- Sound theoretical and practical background in the principles of operating systems and multi-threaded applications.
- Highly motivated, with the ability to work effectively in teams as well as independently.
- Experience integrating data sources such as Oracle, DB2, Teradata, SQL Server, MS Access, XML and flat files into the staging area.
- Effectively tuned ETL frameworks with hash partitioning and sorting algorithms for performance and scalability.
- Strong knowledge of other ETL tools (DataStage 8.1, Talend 4.2) and working knowledge of the Cognos reporting tool.
- Extensively used TOAD 9.0/8.5 to access Oracle databases and Control Center to access DB2 databases.
EDUCATION:
Bachelor's in Computer Science.
TECHNICAL SKILLS:

Operating Systems: Windows XP Professional, Windows Server 2003, HP-UX, Linux
ETL Tools: Informatica PowerCenter v9.x/8.x/7.x, DataStage 8.1, Talend 4.2, SQL*Loader, ILM (Information Lifecycle Management)
Dimensional Data Modeling: Data modeling, star schema modeling, snowflake modeling, fact and dimension tables, physical and logical data modeling, ERwin 4.1.2/3.x, Oracle Designer
Reporting Tools: Cognos 8.1
Languages: C, C++, PL/SQL, SQL
Scripting Languages: JavaScript, UNIX shell
Databases: Oracle 9i/10g/11g, SQL Server 2005/2008, MS Access, DB2, Teradata
Other Tools: Autosys, TOAD, PuTTY, Telnet, WinSCP, Erwin 7.x/4.x, MS PowerPoint, Visio, Remedy, SharePoint, Mercury Quality Center, version control tools, SQL Developer, PVCS, CAD 2000/02/05/CAM, ANSYS 5.4
PROFESSIONAL EXPERIENCE:

ORGANIZATION | DESIGNATION | DURATION
Wipro Technologies, Atlanta, GA | Sr. Informatica Developer | 10/2011 - Present
MyBuys, Ann Arbor, MI | Sr. Informatica Consultant/Developer | 02/2011 - 10/2011
San Mateo County, CA | Sr. Informatica Developer | 07/2008 - 04/2010
Volkswagen, MI | Data Warehouse Consultant | 09/2007 - 06/2008
Global Wireless Solutions, VA | ETL Developer | 11/2006 - 08/2007
Hucon Solutions, Bangalore, INDIA | SQL Consultant | 05/2005 - 09/2006
PROJECT/WORK EXPERIENCE:

WIPRO TECHNOLOGIES, Atlanta, GA                                    10/2011 - Present
Client: CVS Pharmacy
Sr. Informatica Developer

CVS Caremark has over 250 applications with production data in non-production environments. Various compliance and audit requests mandate masking production data in the non-production environments (SIT, DEV, STP). Wipro is responsible for masking sensitive data in 160 applications using the Informatica/ILM tool to meet compliance standards.
Responsibilities:
- Involved in analysis, requirements gathering, functional/technical specifications, development, deployment and testing.
- Prepared LLDs based on the HLDs to meet the business requirements.
- Created Informatica mappings to load data using transformations such as Source Qualifier, Aggregator, Expression, Router, Union, Joiner, Connected and Unconnected Lookup, Filter, Sequence Generator and Update Strategy.
- Used parallel processing capabilities, session partitioning and target table partitioning utilities.
- Extracted source data from IMS DB using PowerExchange.
- Used the Debugger to debug critical mappings and check the data flow from instance to instance.
- Developed mappings to pull information from different tables, using SQL overrides to join tables instead of the Joiner transformation to improve performance.
- Created parameter files in UNIX to run the workflows.
- Scheduled sessions to update the target data using Workflow Manager.
- Developed all mappings according to the design documents and mapping specs provided, and performed unit testing.
- Reviewed and validated the ETL mappings and the data samples loaded into the test environment for data validation.
- Incorporated policies/rules/plans for sensitive elements to be masked using the ILM (Information Lifecycle Management) tool.
- Tuned Informatica sessions by implementing database partitioning, increasing block size, data cache size and sequence buffer length (with the help of DBAs), adjusting the target-based commit interval, and using SQL overrides.
- Performed data validation, reconciliation and error handling in the load process.
- Involved in code migration from Informatica 8.6.1 to Informatica 9.0.1.
- Performed unit testing to validate the data loads in different environments.
- Resolved defects raised by the QA team and updated them in Quality Center.
- Provided support for daily and weekly batch loads.
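For illustration, the workflow parameter files mentioned above typically use Informatica's INI-style layout, with a section per folder/workflow and one parameter per line. This is a minimal sketch; the folder, workflow and parameter names are hypothetical, not taken from the projects described here:

```ini
; Hypothetical Informatica workflow parameter file
; (folder, workflow, and parameter names are illustrative only)
[DW_FOLDER.WF:wf_daily_load]
$$RUN_DATE=2011-10-15
$$SRC_DIR=/data/inbound
$$TGT_SCHEMA=DW_STG
```

Sessions in the workflow would then reference `$$RUN_DATE` and the other parameters instead of hard-coded values, which is what makes the nightly runs repeatable.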
Environment: Informatica PowerCenter 9.0.1/8.6.1, PowerExchange 9.0, ILM, TOAD, PL/SQL, flat files, IMS DB, Oracle 10g, DB2, UDB, mainframes, UNIX.
MyBuys, Ann Arbor, MI                                              02/2011 - 10/2011
Sr. Informatica Consultant/Developer

MyBuys was recognized in the 30th annual Inc. 500 ranking of the fastest-growing private companies in the country. Inc. magazine ranked MyBuys #114 overall and #16 in the Advertising and Marketing category, with sales growth of 2,366% over the preceding three years.
Responsibilities:
- Involved in all phases of the PDLC, from requirements, design, development, testing, training and rollout to field users, through production support.
- Performed data profiling and analysis on source-system data and the current client-side reports to gather requirements for design inception.
- Prepared high-level design documents (TAD) and low-level design documents (ETL specifications).
- Analyzed logical data models, forward-engineered the physical data models using the Erwin tool, and executed them in the DEV environment.
- Worked with the new features in Informatica 8.6.1 and migrated all jobs from 7.x to 8.x.
- Designed and modified more than 20 jobs for Informatica ETL components.
- Modified existing jobs according to client requirements and loaded data into staging tables.
- Designed jobs using complex stages such as Web Services Client (source), XML Input, MQ Series, Complex Flat File and Hashed File.
- Worked in the staging area, supporting and developing jobs.
- Involved in designing relational models for the ODS and data marts using the Kimball methodology.
- Extracted and transformed data from high-volume data sets of delimited files and relational sources to load into the target database.
- Used parameters and variables extensively in all mappings, sessions and workflows for easier code modification and maintenance.
- Analyzed existing SQL queries, tables and indexes for performance tuning, and advised based on loading time.
- Effectively used error-handling logic for data and process errors.
- Performed tuning to increase throughput at both the mapping and session level for large data files by increasing the target-based commit interval.
- Prepared unit test reports and executed the unit testing queries.
- Supported UAT and fixed issues raised in QA.
- Provided post-production support for the project.
Environment: Informatica 8.6.1, Oracle 10g, WinSCP, PuTTY, TOAD, SQL Developer, TortoiseSVN.
San Mateo County, CA                                               07/2008 - 04/2010
Sr. Informatica Developer

The County of San Mateo hosts a series of public services in the health, child support and agriculture sectors. An overall project was underway to remove redundancy and to load and maintain health and clinical data.
Responsibilities:
- Involved in all phases of the PDLC, from requirements, design, development, testing, training and rollout to field users, through production support.
- Performed data profiling and analysis on source-system data and the current client-side reports to gather requirements for design inception.
- Prepared high-level design documents (TAD) and low-level design documents (ETL specifications).
- Analyzed logical data models, forward-engineered the physical data models using the Erwin tool, and executed them in the DEV environment.
- Worked with the new features in Informatica 8.6.1.
- Migrated all jobs from 7.x to 8.x, tested them and moved them to production, making all required changes.
- Modified all the DS parameters from production to development to test the jobs before migrating to production in 8.6.1.
- Worked extensively with the Complex Flat File stage to load data from Oracle to mainframes.
- Designed parallel jobs using CFF (Complex Flat File), Oracle Enterprise, Teradata Enterprise, Teradata MultiLoad, XML Input and XML Output stages.
- Designed jobs using complex stages such as Web Services Client (source), XML Input, MQ Series, Complex Flat File and Hashed File.
- Worked in the staging area, supporting and developing jobs.
- Involved in designing relational models for the ODS and data marts using the Kimball methodology.
- Extracted and transformed data from high-volume data sets of delimited files and relational sources to load into the target database.
- Used parameters and variables extensively in all mappings, sessions and workflows for easier code modification and maintenance.
- Analyzed existing SQL queries, tables and indexes for performance tuning, and advised based on loading time.
- Created pipeline session partitions for concurrent loading of data and to optimize performance when loading target tables.
- Effectively used error-handling logic for data and process errors.
- Performed tuning to increase throughput at both the mapping and session level for large data files by increasing the target-based commit interval.
- Executed Perl and UNIX shell scripts to automate workflows and populate parameter files.
- Prepared unit test reports and executed the unit testing queries.
- Supported UAT and fixed issues raised in QA.
- Involved in understanding business processes, grain identification, and identification of dimensions and measures for OLAP applications.
- Provided post-production support for the project.
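As a sketch of the kind of shell script used above to populate parameter files before a workflow run (the file, folder, workflow and parameter names here are hypothetical, not from the project):

```shell
#!/bin/sh
# Hypothetical sketch: generate a workflow parameter file at run time
# so each load picks up the current run date. All names are illustrative.
PARAM_FILE=wf_daily_load.par
RUN_DATE=$(date +%Y-%m-%d)

# Write an INI-style parameter file; \$\$ escapes keep the literal $$
# prefix that Informatica uses for mapping parameters.
cat > "$PARAM_FILE" <<EOF
[DW_FOLDER.WF:wf_daily_load]
\$\$RUN_DATE=$RUN_DATE
\$\$SRC_DIR=/data/inbound
EOF

echo "wrote $PARAM_FILE"
```

A scheduler step would typically run a script like this just before launching the workflow, so the sessions read fresh parameter values on every execution.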
Environment: Informatica 8.6.1, Windows Server 2003, DB2, PL/SQL, shell scripts, Oracle 10g, SQL*Loader, TOAD, PuTTY, EditPlus.
Volkswagen, MI                                                     09/2007 - 06/2008
Data Warehouse Consultant

Volkswagen manufactures vehicles in Germany and Mexico and distributes them to dealers across the US and Canada. All information pertaining to dealers, such as dealer, warranty and labor information, is stored in the data warehouse. The data mart is a star-schema data repository, maintained in an integrated Oracle environment optimized for reporting. The environment extracts data from heterogeneous source systems and integrates contracts and sales transactions, improving the analytical capability of the internal risk management (contract review) teams.
Responsibilities:
- Analyzed HLD documents and data models.
- Created LLD ETL mapping documents giving detailed information about source-to-target mappings and business rule implementation.
- Designed, developed and debugged ETL mappings using the Informatica Designer tool.
- Created complex mappings using Expression, Joiner, Filter, Sequence, Connected and Unconnected Lookup, Router and Update Strategy transformations in Informatica Designer.
- Extensively used ETL to load data from different sources, such as flat files and XML, into Oracle.
- Implemented slowly changing dimensions to preserve the full history of accounts and transaction information.
- Tuned and monitored Informatica workflows using the Workflow Manager and Workflow Monitor tools.
- Created and configured workflows, worklets and sessions using Workflow Manager.
- Created and scheduled sessions and batch processes (on demand, run on time, run only once) using the Informatica Server Manager.
- Created PL/SQL procedures to aggregate facts and to drop and recreate indexes during loads.
- Created mappings for flat-file targets.
- Determined source, target and mapping bottlenecks for slow mappings.
- Used the Debugger to debug mappings for inconsistencies and repository configurations.
- Performed query tuning of Informatica mappings, sessions and workflows.
- Created mapping parameters and variables to load the staging tables and the data warehouse incrementally.
- Scheduled jobs using the third-party scheduler Autosys.
- Implemented the National City standard data-population method: data is loaded into a new set of tables and, after successful completion of the load, rolled over to the original tables. This keeps the data highly available during loading, instead of making users wait to run their reports.
- Worked on primary production support for projects, resolving issues raised in daily loads.
- Worked on CRs to implement break fixes in production.
- Worked on tickets raised by customers.
Environment: Informatica PowerCenter 7.1, Oracle 9i, TOAD, Erwin 4.x, UNIX and Windows.
Global Wireless Solutions, VA                                      11/2006 - 08/2007
ETL Developer

Global Wireless Solutions is a leading independent benchmarking-solutions vendor for the wireless industry. Here, benchmarking refers to comparing one operator's delivered network quality against competitors' and comparing network performance between markets. The company also provides unique benchmarking solutions for the largest wireless carriers worldwide. My main job was to use data marts, each belonging to a client, as the source data and to work on the ETL process using Informatica.
Responsibilities:
- Designed, developed and debugged ETL mappings using the Informatica Designer tool.
- Migrated mappings, sessions and workflows from Development to Testing and then to Production environments.
- Performed unit, integration and system-level performance testing; worked with the production support team on various performance-related issues.
- Provided production support by monitoring the daily processes.
- Created complex mappings using Aggregator, Expression, Joiner, Filter, Sequence, Stored Procedure, Connected and Unconnected Lookup and Update Strategy transformations in PowerCenter Designer.
- Extensively used ETL to load data from different sources, such as flat files and XML, into Oracle.
- Worked on mapping parameters and variables for the calculations done in the Aggregator transformation.
- Implemented slowly changing dimensions to preserve the full history of accounts and transaction information.
- Tuned and monitored Informatica workflows using the Workflow Manager and Workflow Monitor tools.
- Created, scheduled and configured workflows, worklets and sessions using Workflow Manager.
- Created various geographical reports, such as reports by policy, customer and period, demographic reports and comparative reports.
- Involved in writing project documentation, using Microsoft Visio for diagrams.
- Performed unit testing and migrated code into all environments.
Environment: Informatica PowerCenter 7.1, Oracle 9i, TOAD, Erwin 4.x, UNIX and Windows.
Hucon Solutions, Bangalore, INDIA                                  05/2005 - 09/2006
SQL Consultant

The Amgen-CCS data warehouse collects, denormalizes and integrates data from its operational database, as well as third-party hospital, pharmacy and physician patient reports, to provide critical analytical data for decision makers. PL/SQL is used to extract, transform and load the data from a staging area, built through Oracle 8i snapshot replication of the operational database, into the data warehouse implemented in Oracle 8i.
Responsibilities:
- Worked with analysts and data-source-system experts to map requirements to ETL code.
- Responsible for implementing data integration from source systems into Oracle data marts using stored procedures, functions and triggers.
- Applied business and application knowledge to design the data loads for consistency and integrity.
- Worked with production load standards and error handling; worked with IMS data to validate Sample History module data; assisted in performance tuning by running test runs.
- Created transformation routines to transform and load the data; tuned SQL queries for better performance.
- Unit tested the code and migrated it from Dev to QA and Production.
- Provided post-production support for the application.
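Since SQL*Loader appears in this project's environment, a minimal control file of the kind used to stage such delimited extracts might look like the following. This is only a sketch; the file, table and column names are hypothetical, not taken from the project:

```text
-- Hypothetical SQL*Loader control file for staging delimited patient data
-- (file, table, and column names are illustrative only)
LOAD DATA
INFILE 'patient_reports.dat'
APPEND INTO TABLE stg_patient_reports
FIELDS TERMINATED BY ',' OPTIONALLY ENCLOSED BY '"'
(patient_id, hospital_cd, physician_id, report_dt DATE "YYYY-MM-DD")
```

A load like this would typically feed the staging area, after which PL/SQL routines transform and move the rows into the warehouse tables.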
Environment: Oracle 8i, PL/SQL, Erwin, TOAD, MS Visio, HP-UX, SQL*Loader, Windows NT.