Systems and Solutions: Enterprise Infrastructure, Data, Research Computing and Support

Enhanced Data Integration and Documentation: UCSF-SFDPH DeID OMOP Database

The UCSF-SFDPH DeID OMOP database integrates EHR data from UCSF Health and SFDPH. UCSF Health primarily serves insured patients, while SFDPH provides accessible healthcare regardless of insurance or immigration status. This integration enhances research in population health and health equity by offering a unified patient database with broad demographic and socioeconomic representation. Our team developed detailed methodologies and transparent data mapping processes, including a global patient identifier and extension tables for health system data differentiation.
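
As a purely illustrative sketch (not the project's actual linkage method; the record structures and UUID-based assignment below are assumptions), the bookkeeping behind a global patient identifier plus a source-system extension column might look like this:

    import uuid

    # Hypothetical de-identified records from each source system.
    ucsf_patients = [{"source_person_id": "u-1001"}, {"source_person_id": "u-1002"}]
    sfdph_patients = [{"source_person_id": "s-2001"}]

    person_map = {}  # (source_system, source_person_id) -> global_person_id

    def to_extension_row(source_system, record):
        """Assign one global identifier per source record and tag its health system."""
        key = (source_system, record["source_person_id"])
        person_map.setdefault(key, str(uuid.uuid4()))
        return {"global_person_id": person_map[key], "source_system": source_system}

    # Rows for a hypothetical person extension table differentiating health systems.
    person_extension = ([to_extension_row("UCSF", r) for r in ucsf_patients] +
                        [to_extension_row("SFDPH", r) for r in sfdph_patients])

Matching the same patient across the two systems is the hard part and is not shown here; the sketch only illustrates carrying a global identifier alongside a health-system marker.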

Migrating an On-Prem Learning Management System to AWS with Infrastructure as Code

In this session, we will share our experience migrating our Learning Management System from on-premises to AWS. We will walk through designing a scalable and secure cloud infrastructure using CloudFormation, automating deployments with AWS CodePipeline, orchestrating containers with ECS Fargate, and managing databases with Amazon RDS. The migration process involved moving code via GitHub, files with AWS DataSync, and the database through DBMS export/import.
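
For a rough sense of the automation layer (the stack name, template file, region, and parameters below are hypothetical placeholders, not the actual migration artifacts), a pipeline step that deploys a CloudFormation template from Python with boto3 might look like:

    import boto3

    # Hypothetical template describing the ECS Fargate service and RDS database.
    with open("lms-infrastructure.yaml") as f:
        template_body = f.read()

    cfn = boto3.client("cloudformation", region_name="us-west-2")

    # A CodePipeline/CodeBuild stage could run this to stand up the environment.
    cfn.create_stack(
        StackName="lms-prod",
        TemplateBody=template_body,
        Parameters=[{"ParameterKey": "Environment", "ParameterValue": "prod"}],
        Capabilities=["CAPABILITY_NAMED_IAM"],  # required when the template creates IAM roles
    )

    # Wait for the stack to finish before the pipeline moves to the next stage.
    cfn.get_waiter("stack_create_complete").wait(StackName="lms-prod")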

Navigating the World of E-Form Solutions

This session covers the strengths and weaknesses of three popular form-building platforms: MachForm, Qualtrics, and DocuSign's PowerForm. We will explain and demonstrate each platform, walk through common use-case scenarios, and discuss when one should be chosen over another, both for the platforms' inherent capabilities and for user and security requirements.

From SSIS to DBT: A Modern Approach

Our previous SSIS/Python ETL had become antiquated, disjointed, and difficult to maintain and keep reliable. The answer was a modern, embedded data solution centered on a user-defined data mart that drives informed decision making and supports initiatives prioritized by executive leaders. The goals were to create a self-service data layer for ad hoc questions, a recurring reporting solution, and support for KPI metric dashboards.
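
As a minimal sketch of what the orchestration side can shrink to after the move (the "marts" selector is a hypothetical model selection, not the actual project layout), a scheduled job can simply call dbt's own build and test commands:

    import subprocess

    # Hypothetical nightly job replacing the old SSIS package orchestration:
    # rebuild the data mart models, then run their tests.
    for args in (["dbt", "run", "--select", "marts"],
                 ["dbt", "test", "--select", "marts"]):
        subprocess.run(args, check=True)  # stop the job if any model or test fails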

Ups and downs on the road to a successful business continuity plan

This presentation provides insights into how our department arrived at a successful business continuity plan after years of fits and starts. Rounds of evolving user training tried, without success, to move users toward our vision until we used virtual machines to isolate the gaps between our plan and how users actually work. That allowed us to adjust our testing scenarios to simulate a disaster and arrive at solutions where users can pick up any device (department-provided or BYOD) in a Zero Trust environment and continue their work.

UCLA Data Lakehouse in Action

While university data held immense value, its fragmentation limited its utility. We built a modern cloud platform, centralizing data and establishing a single source of truth. This demo will illustrate how we centralized disparate data, creating a unified foundation for actionable insights. Our agenda: • Cloud Migration: a secure, scalable platform integrating diverse data sources into a single source of truth. • Automated Pipelines: collection, enrichment, and curation that ensure data consistency. • Self-Service Access: user-friendly interfaces for all stakeholders.

Modernizing Specimen Management: Overcoming Challenges with the UC Digital Donor Library

The UC Digital Donor Library (DDL) is a system designed to help UC medical school campuses, including UCD, UCI, UCLA, UCSD, and UCSF, manage every stage of anatomical donations, from donor registration and preparation to inventory tracking, allocation, and final disposition. Originally launched in 2008 as a desktop application, the DDL underwent a major modernization effort in 2023, moving to a secure, cloud-based solution. In this session, we'll cover: • Program Overview: an introduction to the DDL and the key challenges faced during implementation. • Lessons Learned: user feedback management and validation.

Sunapsis: an innovative platform for managing international student and scholar services

In this discussion, we will provide an overview of the Sunapsis platform and introduce the background of the Sunapsis project at UCSD. We will then cover the project outline, including system configuration and data integration, as well as the challenges we have overcome. Additionally, we will highlight the accomplishments achieved so far.

Advanced SQL for Analytics

Do you consider yourself okay with SQL but want to level up? This session will cover SQL functions that can be used for analytics. Most of these functions are available in ANSI SQL and thus work on most databases, but some functions will be specific to Oracle, Snowflake, and BigQuery (and perhaps other databases I haven't researched). We'll cover window functions, aggregations, pivoting, statistical functions, and even machine learning capabilities within SQL. Prepare to leave this session excited to impress your analytics stakeholders with more advanced ways of looking at data!
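
As a small, self-contained taste (made-up data; runnable with Python's built-in sqlite3 as long as the underlying SQLite is 3.25 or newer, and the same ANSI SQL works on Oracle, Snowflake, and BigQuery), here is a window-function query computing a per-region running total and an overall rank:

    import sqlite3

    con = sqlite3.connect(":memory:")
    con.execute("CREATE TABLE sales (region TEXT, month TEXT, amount REAL)")
    con.executemany("INSERT INTO sales VALUES (?, ?, ?)", [
        ("West", "2024-01", 100.0), ("West", "2024-02", 150.0),
        ("East", "2024-01", 80.0),  ("East", "2024-02", 120.0),
    ])

    query = """
        SELECT region, month, amount,
               SUM(amount) OVER (PARTITION BY region ORDER BY month) AS running_total,
               RANK() OVER (ORDER BY amount DESC) AS overall_rank
        FROM sales
        ORDER BY region, month
    """
    for row in con.execute(query):
        print(row)  # e.g. ('East', '2024-01', 80.0, 80.0, 4)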

Our Data Lakehouse Architecture in GCP

This talk will explain our Data Lakehouse tool and architecture stack that allows us to rapidly deliver data for AI, analytics, reporting, integrations, and more. We'll cover our use of Invoke for data ingestion, dbt for pipelines, BigQuery as a database, Looker as an analytics tool, and Vertex AI for data science. We'll also cover our medallion-like architecture within the data lakehouse.
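
To make the "BigQuery as a database" piece concrete (a minimal sketch; the project, dataset, and table names are hypothetical placeholders, not an actual schema from this lakehouse), reading a curated-layer table from Python looks roughly like:

    from google.cloud import bigquery  # pip install google-cloud-bigquery

    # Query a curated ("gold"-layer) table in the lakehouse.
    client = bigquery.Client(project="my-gcp-project")

    sql = """
        SELECT term, COUNT(*) AS enrolled_students
        FROM `my-gcp-project.curated.enrollments`
        GROUP BY term
        ORDER BY term
    """
    for row in client.query(sql).result():
        print(row.term, row.enrolled_students)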