
Data Architect Sample Resume Format in Word Free Download

Sample Template Example of Beautiful Excellent Professional Curriculum Vitae / Resume / CV Format with Career Objective, Job Description, Skills & Work Experience for Freshers & Experienced in Word / Doc / Pdf Free Download





Rimiton Blake



 

 

SUMMARY


Information Systems professional with broad experience across Information Technology.  Strong background in Information/Data Architecture, Data Modeling and Systems Development.  Seeking a position where I can use my extensive experience and insight to help guide an Enterprise Information Architecture to success.  I can offer guidance from inception to deployment and from requirements to development, always from a pragmatic perspective backed by proven hands-on ability.




Professional Qualities and Skills

•  Data Architect
•  Data Modeler
•  BA / Requirements Gathering
•  Reference Data / Master Data Management and Distribution
•  Data warehousing specialist (Kimball methodology)
•  Very large-database (VLDB)
•  High-performance database (Oracle Exadata, Vertica, Sybase IQ)
•  Extensive OLAP and OLTP experience
•  DBA
•  Systems Developer / Systems Background / Systems Outlook

•  Excellent problem solver / debugger
•  Understand concepts and details readily
•  Learn new skills quickly
•  Make connections between existing knowledge and new information
•  Draw on broad and deep experience
•  Comfortable with the technology as well as with presenting to management and cross-functional team members
•  Skilled at optimizing processes and data for better performance and resource utilization
•  Enjoy, and am successful at, mentoring others


•  DBMSs:          Oracle 6-11g RAC; Oracle Exadata; SQL Server; Sybase; Sybase IQ; Vertica
•  Repositories:   Golden Source; Model Mart
•  Platforms:      PC; Sun Workstation; Apollo Workstation; IBM Mainframe
•  OS:             Windows; DOS; Unix/Solaris; AIX; Linux; VM/CMS; MVS; Aegis
•  Languages / Tools:  SQL; PL/SQL (Oracle); TOAD; ERwin; T/SQL (SQL Server / Sybase); Visual Basic;
         C; Pascal; SQL*Developer (Raptor); NOMAD; Assembler; Unix shell; PC/DOS shell;
         XML; XDB (Oracle’s XML object/relational framework)



Work History


May 2011 – present                                    Screen PC Inc, Westport:                        Data Architect
·         Oracle back end for a “set-top” start-up venture
·         OLTP and OLAP data modeling
·         Performance tuning
·         Very Large Database (VLDB) design
·         Oracle, PL/SQL, SQL, Java interface



April 2010 – March 2011               Citibank, NYC/NJ:                                      Data Architect
·         OLTP data modeling
·         Performance tuning
·         Very Large Database (VLDB) design
·         PL/SQL packages
·         Interface development between Java and Oracle RAC
·         All projects were part of Citi’s premier, real-time Cash Management System named CitiDirect

•  Audit Platform
o    Designed complete replacement of database model and processing for tracking Audit Events
o    Audit Event information is passed via a Tibco queue in XML format from over 50 subsystems.  The XML is shredded in the Tibco server and passed as array objects to an Oracle package for persistence (a sketch of this array-based interface follows the POC list below).  The choice of this approach was based on the results of multiple proofs-of-concept (POCs) I designed and built, as well as on scalability cost and contention for shared resources.
o    Volumetrics:
·         Throughput SLA of 50 events/second during business hours
·         Each event averaging 35 attributes
·         Bulk import-file processing generated upwards of 150,000 events within minutes.  No “lag SLA” was defined, but the system had to support processing these bulk events readily.
·         Two years of Audit Events kept online for real-time inquiry, reporting and viewing
·         Approximately 1-3 billion Audit Events
·         Approximately 25-100 billion details (each stored as a database record)
o    For these volumes, unit performance and disk space utilization were critical.
o    POCs were used to determine viability (I designed and built all of these):
1.     Receive XML as a string, store it as a CLOB, and shred into relational tables using Oracle’s XPath parsing upon retrieval.
2.     Receive XML as a string, shred into relational tables using Oracle’s XPath parsing upon receipt.
3.     Register the XML schema (XSD) with Oracle’s XDB functionality, receive the XML as a string, and insert it directly into an object-relational table (i.e. the structured storage created during XSD registration).  Oracle then performs automatic shredding during the insert based on the XSD, allowing retrieval from the structured storage objects (no longer XML).
·         Interesting results were obtained:
o    Initial performance during stress testing was borderline acceptable (75 events/second)
o    This dwindled to 2 events/second after 150,000 events had been processed
·         I was unable to find a workaround or explanation for this in the timeframe available.
4.     Shred the XML in Tibco server and pass arrays-of-struct to Oracle.
All designs supported suitable scalability and concurrency.
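For reference, a minimal sketch of the adopted array-based interface (POC 4).  The type, package and table names (audit_event_t, audit_pkg, audit_event_detail) are illustrative placeholders, not the actual CitiDirect objects; the Tibco server binds an array of structs that maps to a SQL collection type, and the Oracle package persists the whole array in a single statement.

-- Hypothetical SQL object and collection types mirroring the shredded event attributes
CREATE TYPE audit_event_t AS OBJECT (
  event_id    NUMBER,
  event_time  TIMESTAMP,
  subsystem   VARCHAR2(50),
  attr_name   VARCHAR2(100),
  attr_value  VARCHAR2(4000)
);
/
CREATE TYPE audit_event_tab AS TABLE OF audit_event_t;
/

CREATE OR REPLACE PACKAGE audit_pkg AS
  PROCEDURE persist_events (p_events IN audit_event_tab);
END audit_pkg;
/
CREATE OR REPLACE PACKAGE BODY audit_pkg AS
  -- Persist an entire array of shredded audit events in one round trip
  PROCEDURE persist_events (p_events IN audit_event_tab) IS
  BEGIN
    INSERT INTO audit_event_detail
      (event_id, event_time, subsystem, attr_name, attr_value)
    SELECT e.event_id, e.event_time, e.subsystem, e.attr_name, e.attr_value
    FROM   TABLE(p_events) e;
    COMMIT;
  END persist_events;
END audit_pkg;
/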



•  Automated File and Report Delivery (AFRD)
o    Transmission of disk files, typically via FTP or email, on a scheduled, automated basis.
o    Built data model for this.
o    The heart of this system is a “schedule”, which is made up of “timing” information (i.e. when to deliver) and multiple objects to deliver (i.e. what, where and how, including security and logging information).
o    The objects were called “linkages,” each consisting of an entity, security info and delivery info.
o    The log info was modeled simply as a recursive relationship to a “linkage,” since it shares the attributes required for delivering something (a DDL sketch of this shape follows this list).
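A minimal DDL sketch of that shape (table and column names here are illustrative assumptions, not the production AFRD model): each schedule row carries the timing, each linkage row carries the what/where/how, and a delivery-log entry is simply a linkage row that points back at the linkage it records.

-- Hypothetical AFRD structures
CREATE TABLE schedule (
  schedule_id    NUMBER PRIMARY KEY,
  frequency      VARCHAR2(20),      -- timing: when to deliver
  next_run_time  DATE
);

CREATE TABLE linkage (
  linkage_id         NUMBER PRIMARY KEY,
  schedule_id        NUMBER REFERENCES schedule,
  parent_linkage_id  NUMBER REFERENCES linkage,  -- recursive: log rows reference the linkage they record
  entity_name        VARCHAR2(100),              -- what to deliver
  delivery_info      VARCHAR2(400),              -- where and how (FTP target, email address, ...)
  security_info      VARCHAR2(400)
);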

•  Inquiry Module Performance Tuning
o    Standard Inquiry UI (for Transaction, Account Statement and Balance Summaries) plus app-server were making repeated calls to one another and then to the DB layer for different analyses of the same underlying set of user-specified data from a data warehouse.  Most of these calls were performed sequentially after the user entered her/his filter criteria and before returning control to the user.
o    Minimized the number of round trips from the UI to the database.  This required convincing the Java Platform team to build multiple-list capability into the response message from the app-server to the UI.
o    The solution used an ANSI SQL feature known in Oracle as subquery factoring (the WITH clause), as sketched below.
o    This solution saved significant time: even though 6 or 7 comparable queries against the same result set were processed, it added no more than 20% to the runtime of the original complex query.
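A simplified illustration of that pattern (the table, column and bind-variable names are placeholders, not the CitiDirect schema): the user’s filter criteria are evaluated once in a factored subquery, and several analyses are answered from that one result set in a single round trip.

-- Hypothetical use of subquery factoring (the WITH clause)
WITH filtered_txn AS (
  SELECT account_id, txn_date, amount, txn_type
  FROM   transactions
  WHERE  account_id = :account_id
  AND    txn_date BETWEEN :from_date AND :to_date
)
SELECT 'BY_TYPE' AS analysis, txn_type AS grouping_key, SUM(amount) AS total
FROM   filtered_txn
GROUP  BY txn_type
UNION ALL
SELECT 'BY_DAY', TO_CHAR(txn_date, 'YYYY-MM-DD'), SUM(amount)
FROM   filtered_txn
GROUP  BY TO_CHAR(txn_date, 'YYYY-MM-DD');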

•  MIS Reporting
o    This was a data mining project.
o    The goal was to identify Citibank Clients and group them together based on their usage of various functionalities within the CitiDirect Cash Management product.
o    The purpose was to better understand how to roll out new versions of the product, targeting sets of Clients by the functionality they used.  This also allowed the development schedule/path to be modified to accommodate large groups of Clients in many countries, so more Clients would benefit from each new release of the product.



Oct 2007 – April 2010         Standard & Poor’s, NYC:             Director, Information Architecture,
Data Services
·         Responsible for Data Modeling and Design across S&P’s business units, including Equities, Fixed Income (Corporate Bonds, Treasuries, Sovereigns), Structured Finance (RMBS, CMBS, CDO(2), ABS), Derivatives (CDS - Credit Default Swaps) and Foreign Exchange, as well as Master Data (Prices, FX, Org Structure (D&B, etc.), CUSIP, etc.).
·         Responsible for Data Integration Architecture
·         Led onshore and offshore data teams
·         Led onshore and offshore development teams
·         Subject Matter Expert (SME) resource within project teams for Data and Architecture
·         Integrated Golden Source reference/master data into operational systems via Pub/Sub, Replication and SOA.

•  Integrated Feed Platform
This Oracle RAC system will offer seamless creation of data feeds that can be dynamically defined by end-users, choosing among attributes that cross data domains, including reference (from Golden Source), descriptive, rating, detail, roles and performance data.
o    Performed high-performance database analyses and comparisons among Oracle Exadata, Vertica, Sybase RAP/IQ and Teradata
·         Oracle Exadata was chosen because:
·         Exadata V2 is really grid computing, properly pre-configured, with all the best practices ready to deploy.
·         SmartScan gives high throughput using both disk-level column-projection and disk-level predicate-evaluation
·         Easy to leverage existing in-house and vendor support for administration, development and maintenance.  Legacy “Oracle knowledge” is preserved.
·         Block-level, column-oriented storage was not significant.
o    Data Modeling
o    Design
o    Metadata architecture

•  Global Rating System
This new Oracle RAC system is used by S&P as a common platform for rating Insurance Companies, Industrials and Financial Institutions initially, with additional Practices added over time.
The initial volumetrics are:
·         40 billion rows of fundamental financial data (increasing by 6 billion / year)
·         10-12 terabytes of data
·         Hundreds of users
·         Real-time response around the world
·         Over a dozen data sources with the ability to easily add more
o    Including Compustat, Bloomberg, CapIQ, Edgar, FDIC, Reuters, INTEX, Toyo Keizai, NAIC, etc. plus proprietary, internal S&P data feeds.
o    Supplied historical ratings, financials (P&L, balance sheet, etc.), market data and pricing, regulatory information, FX conversion, economic data, etc.
o    Over 45,000 individual data elements
o    Ability to predefine and automate choice of data source for each data element by geography and industry of the organization.
o    Expressions can be defined based on other data elements
o    Arrays were supported to partition a single data element into business segments (e.g. Total Premiums Written by state by line-of-business).
·         Dozens of predefined, analytical, Excel-like templates
o    Performed data requirements gathering.
o    Data Modeling
o    ETL and server architecture, including Golden Source master data via Pub/Sub, Replication and SOA.
o    Architected Oracle Data Guard, Oracle Change Data Capture (CDC) and Flashback Data Archive (“Oracle Total Recall”) for replication needs (see the brief sketch after this list).
o    Performance enhancement
o    Design
o    Development of complex server-side code
o    Leading others in the use of the data model
o    Debugging
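Of the replication pieces above, Flashback Data Archive is the simplest to show briefly; the archive, tablespace and table names below are illustrative assumptions, not the actual S&P objects.

-- Hypothetical Flashback Data Archive ("Oracle Total Recall") setup
CREATE FLASHBACK ARCHIVE rating_history_fba
  TABLESPACE fba_ts
  RETENTION 7 YEAR;

ALTER TABLE rating_event FLASHBACK ARCHIVE rating_history_fba;

-- Historical versions of rows then become queryable with AS OF
SELECT *
FROM   rating_event AS OF TIMESTAMP (SYSTIMESTAMP - INTERVAL '30' DAY);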


•  Master Data Foundation (Reference Data) and Data Services Framework Architecture
S&P maintains its master data in a federated configuration for business subject areas (typically organized by asset class and region) with a consolidated architecture for reference, descriptive, organization, instrument, role and linking information.  Some of this is in Golden Source.
o    Responsible for tuning, incorporating new subject areas and new feeds (both internal and external), and sharing/teaching our Enterprise approach (EII) with business units and their initiatives.
o    The Framework allows a consistent, reusable approach to data distribution (ETL, pub/sub, SOA), stewardship and maintenance.
o    Data models, as well as ETL, are shared and leveraged to allow quick time-to-initiation for new development, including actual data (not dummy data) for early performance and QA testing.


•  Structured Finance Consolidation and Enhancement
o    Oversaw design to accommodate all deal structures (RMBS, CMBS, ABS, CDO, etc.)
o    Performance enhancement
o    Oracle implementation
o    Included varied and new performance measures at loan-level



2006 – 2007  Systems and Software Inc:         DBA / Data Modeler
Systems & Software Inc’s premier product, enQuesta, provides a scalable and configurable software solution to municipal and investor-owned electric, water and gas utilities, with the following modules:
o    Account Management
o    Utility Billing
o    Automated Workflow
o    Revenue Management
o    Credit & Collections
o    Device Inventory Management
o    Financial Management
o    Payment Processing

Duties included:
o    Oracle performance enhancement and debugging.
o    Oracle backup via export, tablespace-transport and RMAN.
o    Oracle database installation and replication.
o    Data modeling, ETL and interfaces with Java
o    Liaison with Business Intelligence Group (Cognos) on data model, materialized views, BI needs, etc.


1995 – 2006    Pfizer, Inc., NYC:                                      Data Architect / DBA / Warehouse Specialist
•  Business Technology and Information Systems Department

•  Therapeutic Class System
o    Market Research data warehouse system.
o    Sole architect and designer; built this system entirely from the ground up.
o    This market share information was used in extremely complex, proprietary statistical algorithms to analyze product trending behavior.  This allowed sales representatives to identify areas where they could best concentrate their efforts.
o    Designed the entire system to be table-driven to allow extensibility and performance optimization (see the sketch after this list).
o    Responsible for managing, maintaining and extending this system, and interacting with end-users.
o    Oracle 9i database (SQL*Loader, Oracle Report Writer 6i, SQL*Plus, SQL, PL/SQL scripts/packages)
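A minimal sketch of what “table-driven” means here (the control table and procedure names are hypothetical): processing steps live in a rule table, so extending the system is an INSERT rather than a code change.

-- Hypothetical control table driving the load/transform steps
CREATE TABLE process_rule (
  rule_id     NUMBER PRIMARY KEY,
  step_order  NUMBER,
  enabled     CHAR(1) DEFAULT 'Y',
  plsql_call  VARCHAR2(200)         -- e.g. 'etl_pkg.load_market_share'
);

-- Driver block: execute each enabled step in order
BEGIN
  FOR r IN (SELECT plsql_call
            FROM   process_rule
            WHERE  enabled = 'Y'
            ORDER  BY step_order) LOOP
    EXECUTE IMMEDIATE 'BEGIN ' || r.plsql_call || '; END;';
  END LOOP;
END;
/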

•  Incentive Compensation ETL System
o    Edit/transform/load data warehouse system.
o    Sole architect and designer; built this system entirely from the ground up.
o    Solely responsible for managing, maintaining and extending this system
o    Heavy interaction with the end-users.
o    Designed the entire system to be table-driven with reusable, modular components to allow simple extensibility and performance optimization.  This allowed it to also be used for the Therapeutic Class System described above.

•  Starter Tracking System
o    OLTP and reporting system for “Free Samples” which sales representatives give to doctors.
o    Inherited this fairly well-designed system, added optimization and converted as much as possible to table-driven structures.
o    Solely responsible for managing, maintaining and extending this system, and interacting with the end-users.
o    NOMAD database management system

1994                           Greenwich Capital Markets, CT:                                    DBA / Database Developer
•  Human Resources: general reporting, development and maintenance
•  Inherited this HR system and extended its functionality while adding controls and optimization.
•  Managed one other person
•  Unix operating system (shell scripts)
•  Sybase / SQL Server database (ISQL, SQL, T/SQL scripts, triggers, stored objects)


1993                           American Express, NYC  - Consulting:            DBA / Database Developer
•  Financial consolidation project
•  Member of a team using NOMAD as a front end to DB2, writing reports against financial information
•  Wrote many BI analysis queries
•  Designed a consistent report format for all queries


1990 – 1993              A.C. Nielsen Coupon Redemption, TX:            System Developer
•  Project for A.C. Nielsen for coupon redemption on the Texas/Mexico border
•  Built a real-time, PC-based system in which thousands of PCs connected to about 10 IBM midrange servers; the servers held the “dimension” (or “master”) data needed for all lookups and received the final results.
•  This was a classic 3-tier architecture, using a dispatcher process on one of the servers to pass messages between the PCs and the remaining servers.
•  Had technical oversight for a group of Nielsen employees performing backroom data processing.
•  DOS operating system on the PCs; system written in C







Education

Columbia University / Columbia College:                    Political Science
City University of New York:                                         Computer Science



Download Resume Format 
