
2004 SCEC Annual Meeting · Web view: This large, comprehensive document (188 pp., 8.1 MB) provides an excellent snapshot of the Center's 2004 activities.


2005 SCEC Annual Meeting

September 11-14, 2005

Proceedings and Abstracts
Volume XV

Riviera Resort and Racquet Club
Palm Springs, California


Table of Contents

Section I. SCEC Annual Meeting Agenda . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 1
3D Rupture Dynamics Code Validation Workshop . . . . . . . . . . . . . . . . . 7
WGCEP/National Seismic Hazard Map Meeting . . . . . . . . . . . . . . . . . . 8

Section II. 2005 Annual Meeting Participants . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 9

Section III. SCEC Organization . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 17
SCEC Board of Directors . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 18
2005 SCEC Planning Committee . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 19
2005 SCEC Advisory Council . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 20

Section IV. Tribute to Kei Aki . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 21

Section V. State of SCEC, 2005 . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 29
2004 Advisory Council Report . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 42
CEO Report . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 49
Draft 2006 SCEC Program Announcement . . . . . . . . . . . . . . . . . . . . . . . . 64

Section VI. Abstracts for Invited Talks . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 87
ITR Session Abstract . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 93
2005 SCEC Meeting Abstracts . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 95


SECTION I

2005 SCEC ANNUAL MEETING AGENDA

Sunday, September 11

07:00 Continental Breakfast Desert Conference Center

08:30 Rupture Dynamics Workshop (Chairs: Harris/Archuleta) Mesquite A and B

09:00 Teacher Training Workshop (Chairs: deGroot/Cooper) Mesquite C and D

15:00 Salt Creek Trench Viewing Off Site at Salt Creek (Gordon Seitz and Pat Williams)

15:00 Poster Session Set-Up Mesquite/Oleander

18:00 Icebreaker Reception Mediterranean

20:00 SCEC Advisory Council Executive Session (Chair: Solomon) Board Room

20:00 WGCEP/National Seismic Hazard Map (Chair: Chris Wills) Whitewater

20:00 Poster Session Mesquite and Oleander


Monday, September 12

Session I: Chair: Tom Jordan (Grand Ballroom)

07:00 Continental Breakfast Desert Conference Center

08:00 Welcome Tom Jordan

08:10 “Report from NSF Earth Sciences” Kaye Shedlock

08:30 "The USGS/SCEC Partnership" David Applegate/Michael Blanpied

08:50 State of the Center Tom Jordan (including SCEC3 proposal submission/review)

09:20 State of the CEO Program Mark Benthien

09:40 Break

10:00 “Recent Discoveries from the Taiwan Chelungpu-Fault Drilling Program” Kuo-Fong Ma

10:30 "Physical properties and multi-scale seismic anisotropy in the San Andreas Fault Observatory at Depth" Naomi Boness

11:05 “The Parkfield/Landers Reference Earthquakes Digital Library” Brad Aagaard/Alexei Czeskis/Jessica Murray/Anupama Venkataraman/Greg Beroza

11:25 Introduction to 2006 Planning Process Ralph Archuleta

11:40 Lunch (Mediterranean Room)


Session II: Chair: Ralph Archuleta (Grand Ballroom)

13:00 Earthquake Source Physics Plenary Ruth Harris/David Oglesby

14:00 Seismology Plenary John Vidale/Peter Shearer

15:00 FARM Plenary Terry Tullis/Judi Chester

16:00 Geology Plenary Tom Rockwell/Mike Oskin

17:00 Seismic Hazard Analysis Plenary Ned Field/David Jackson (to include discussion on CSEP)

18:30 Cocktails (Mediterranean Room)

18:45 Dinner (Mediterranean Room)

19:30 A Tribute to Kei Aki Tom Jordan/John McRaney/Bill Ellsworth

20:00 Poster Session (Mesquite and Oleander Rooms)

20:00 SCEDC Users Meeting (Grand Ballroom) Rob Clayton/Vikki Appel


Tuesday, September 13

Session III: Chair: Tom Jordan (Grand Ballroom)

07:00 Continental Breakfast Desert Conference Center

08:00 “Overturning of Slender Blocks: Numerical Investigation and Application to Precariously Balanced Rocks in Southern California” Matt Purvance

08:30 “Constraining Extreme Ground Motions in Seismic Hazard Analyses” Norm Abrahamson

09:00 Ground Motion Plenary Paul Davis/Rob Graves

10:00 Structural Representation Plenary John Shaw/Jeroen Tromp

11:00 Fault Systems Plenary Brad Hager/Sally McGill/Jim Dieterich

12:00 Lunch (Mediterranean Room)


Session IV: Chair: Ralph Archuleta (Grand Ballroom)

13:30 "Imaging of active seismogenic faults with space geodesy" Yuri Fialko

14:00 "Structural versus Nonstructural Seismic Response to Ground Motion Ensembles" Tara Hutchinson

14:30 Geodesy Plenary Duncan Agnew/Mark Simons

15:30 Implementation Interface Plenary Paul Somerville/Rob Wesson

16:30 SCEC/CME Plenary Phil Maechling

Transforming Seismic Hazard Analysis Ned Field/Robert Graves

– Impact of OpenSHA on SHA
– Building the Next Generation SHA Tools: CyberShake

The TeraShake Earthquake Simulation Platform Bernard Minster/Kim Olsen

– TeraShake 1 Simulations and Results
– TeraShake 2 Simulations and Results

SCEC Scientific Workflow Tools Yolanda Gil

– Using Knowledge Tools to Assist Workflow Planning
– Grid-based Workflow Tools

Next Generation Techniques David O’Hallaron/Ralph Archuleta

– Highly Scalable Simulations: Hercules Tool Chain
– Finite Element Dynamic Rupture Codes

18:30 Cocktails (Mediterranean Room)

19:00 Dinner (Mediterranean Room)

20:00 Poster Session (Mesquite and Oleander Rooms)

20:00 SCEC AC Meeting (Board Room) Sean Solomon/Tom Jordan


Wednesday, September 14

Session V: Chair: Tom Jordan (Grand Ballroom)

07:00 Continental Breakfast Desert Conference Center

08:00 CEO/College Earthquake Course Working Group Mark Benthien

09:15 Advisory Council Report Sean Solomon

09:30 Meeting Summary: Group Leaders
Focus and Disciplinary Group Reports and Discussion (10 minutes each)

11:00 Wrap-Up and Planning for 2006 Tom Jordan

12:00 SCEC Board Meeting (Whitewater)

SCEC PC Meeting (Snow Creek)

13:00 FARM Workshop and Field Trip Jim Evans/Judi Chester/Fred Chester


3D Rupture Dynamics Code Validation Workshop - 2005
Conveners: Ruth Harris and Ralph Archuleta

Mesquite A/B

Sunday, September 11

08:30-08:55 Workshop Introduction (Ruth Harris/Ralph Archuleta)

09:00-09:20 Comparison of Two Spontaneous Rupture Methods (Steve Day/Luis Dalguer/Nadia Lapusta/Yi Liu)

09:25-09:45 A New Feature in the SEM Code (Jean Paul Ampuero)

09:45-10:05 Break

10:10-11:30 The Problem Versions 4+5 Comparisons/Discussion (Harris/Archuleta)

11:30-12:30 Lunch

12:30-12:50 A New SCEC IT Visualization Tool (Kim Olsen)

12:55-13:15 The Reference Earthquakes Digital Library Rupture Mode Format (Brad Aagaard)

13:15-14:00 General Discussion and Future Plans


Fault Displacement Parameters Input for the Working Group on California Earthquake Probabilities

Conveners: Chris Wills and Mark Petersen
Whitewater Room

September 11, 2005

20:00 WGCEP project overview
Funding, management structure, end users
Deadlines and deliverables
Types of modeling to be done
Needed databases and interfaces
Overview of technical implementation of project
Overview of how current data model came to be

21:00 Discussion of fault location information
2002 NSHM model
CFM
Merged California model
Updates needed for the NSHM and WGCEP

21:30 Summaries of current databases and slip rate values
2002 model
Recent geodetic model
Discussion: New slip rate data since 2002

How to summarize slip data for fault section database

22:00 Summaries of new database for fault displacement
Need for development of deformation model
Proposed database structure
Proposed input interface
Discussion: How slip data fits in proposed database

How to summarize slip data for fault section database


Section II

2005 SCEC Annual Meeting Participants

[Participant roster (Lastname, Firstname, Organization, Email Address; several hundred entries on original pages 9-16) omitted: in the extracted text every email address was redacted to "[email protected]", and each redaction also swallowed the following participant's last name, so the table cannot be reliably reconstructed.]

SECTION III

SCEC ORGANIZATION – 2005

Center Director: Thomas H. Jordan, University of Southern California

Deputy Director: Ralph Archuleta, University of California, Santa Barbara

Associate Director for Administration: John K. McRaney, University of Southern California

Associate Director for Communication, Education, and Outreach: Mark Benthien, University of Southern California

IT Architect: Phil Maechling, University of Southern California

Project Specialists: Sally Henyey, Shelly Werner, Dana Coyle

Education Specialist: Robert deGroot

Programmer and Webmaster: John Marquis

FIS and UseIT Intern Program Manager: Sue Perry

Programmer Analysts: Nitin Gupta, Vipin Gupta, Hunter Francoeur

Systems Programmer: John Mehringer


2005 SCEC BOARD OF DIRECTORS

Chair: Thomas H. Jordan, USC

Vice-Chair: Greg Beroza, Stanford

Members: Jim Brune, UNR
Doug Burbank, UCSB*
Steve Day, San Diego State
James Dieterich, UCR
Bill Ellsworth, USGS/Menlo Park
Lisa Grant, UCI
Emily Brodsky, UCLA
Tom Heaton, Caltech
Tom Herring, MIT
Lucy Jones, USGS/Pasadena*
Bernard Minster, UCSD*
Jim Rice, Harvard
Bruce Shaw, Columbia
Terry Tullis, Brown
Rob Wesson, USGS/Golden

*Members of the Executive Committee of the Board


2005 SCEC PLANNING COMMITTEE

Chair: Ralph Archuleta

Earthquake Source Physics Focus Group: Ruth Harris, David Oglesby

Seismic Hazard Analysis Focus Group: Ned Field, David Jackson

Structural Representation Focus Group: John Shaw, Jeroen Tromp

Ground Motion Focus Group: Paul Davis, Robert Graves

Fault Systems Focus Group: Brad Hager, Sally McGill, Jim Dieterich, Charles Sammis

Geology Disciplinary Group: Tom Rockwell, Mike Oskin

Geodesy Disciplinary Group: Duncan Agnew, Mark Simons

Seismology Disciplinary Group: John Vidale, Peter Shearer

Fault and Rock Mechanics Group: Terry Tullis, Judith Chester

Implementation Interface: Paul Somerville, Rob Wesson

ITR: Phil Maechling, Bernard Minster

WInSAR: Mark Simons

Borderlands: Craig Nicholson


2005 SCEC ADVISORY COUNCIL

Sean SOLOMON (Chair), Carnegie Institution of Washington, Dept. of Terrestrial Magnetism, 5241 Broad Branch Road, N.W., Washington, DC 20015-1305, [email protected]

Gail ATKINSON, Carleton University, 2240 Herzberg Building, Ottawa, Ontario, K1S 5B6, Canada, [email protected]

Lloyd CLUFF, Pacific Gas and Electric, P.O. Box 770000, MC N4C, San Francisco, CA 94177, [email protected]

Jeffery FREYMUELLER, University of Alaska, Geophysical Institute, P.O. Box 757320, Fairbanks, AK 99775-7320, [email protected]

Patti GUATTERI, Swiss Reinsurance, 75 King Street, Armonk, NY, [email protected]

Kate MILLER, University of Texas at El Paso, Department of Geology, 700 W. University Avenue, El Paso, TX 79968, [email protected]

Jack MOEHLE, Pacific Earthquake Eng. Research Center, 1301 S. 46th St., Bldg. 451, Richmond, CA, [email protected]

Garry ROGERS, Geological Survey of Canada, Box 6000, Sidney, BC, V8L 4B2, Canada, [email protected]

Chris ROJAHN, Applied Technology Council, 201 Redwood Shores Parkway, Suite 240, Redwood City, CA 94065, [email protected]

John RUDNICKI, Northwestern University, Department of Civil and Environmental Engineering, A333 Technological Institute, 2145 Sheridan Road, Evanston, IL, [email protected]

Ellis STANLEY, City of Los Angeles, Emergency Preparedness Department, 200 N. Main Street, Room 1500, Los Angeles, CA, [email protected]


SECTION IV

Tribute to Kei Aki


SECTION V

State of SCEC, 2005
THOMAS H. JORDAN

Director, Southern California Earthquake Center

I welcome you all to the 2005 Annual Meeting in Palm Springs. This will be the fourth community-wide gathering since SCEC was reconfigured as a free-standing center on February 1, 2002. The past year has been exceptionally active, and the meeting agenda is chock full of sessions where the results of your efforts will be presented and discussed. A particularly notable accomplishment was the submission of the SCEC3 proposal to the National Science Foundation and U.S. Geological Survey, which maps out the Center’s plans for 2007-2012. As of this writing, we have not received official word about the status of our proposal, but the site review in early June seemed to go very well, and I am optimistic that our plan will be accepted by both agencies. I will summarize elements of the SCEC3 plan in my report below.

In addition to the working group sessions, the agenda features outstanding science presentations, a strong set of science posters, and a variety of IT demonstrations, education and outreach activities, and social gatherings. I look forward to participating with you in all of these events.

Figure 1. Registrants at SCEC Annual Meetings, 1991-2004.

Organization and Leadership

SCEC is an institution-based center, governed by a Board of Directors who represent its members. Over the past year, UC Riverside became a core institution, and two new organizations, the University of Utah and the Institute of Geological and Nuclear Sciences (New Zealand), joined as participating institutions, raising the membership to 15 core institutions and 40 participating institutions (Table 1). A January 2005 census indicated that 565 scientists and other experts are involved in active SCEC projects, which makes SCEC one of the largest collaborations in all of geoscience. Another measure of SCEC involvement—registrants at our annual meetings—is shown for the entire history of the Center in Figure 1.

Board of Directors. Under the SCEC2 by-laws, each core institution appoints one board member, and two at-large members are elected by the Board from the participating institutions. This year we welcome to the Board Prof. Jim Dieterich, who now represents our newest core institution, UC Riverside. The other 16 members of the Board are Greg Beroza (Vice-Chair/Stanford), Emily Brodsky (UCLA), Jim Brune (UNR), Doug Burbank (UCSB), Steve Day (SDSU), Bill Ellsworth (USGS-Menlo Park), Lisa Grant (At-Large), Tom Heaton (Caltech), Tom Herring (MIT), Lucy Jones (USGS-Pasadena), Bernard Minster (UCSD), Jim Rice (Harvard), Bruce Shaw (Columbia), Terry Tullis (At-Large), Rob Wesson (USGS-Golden), and myself (Chair/USC). John McRaney continues to act with his characteristic efficiency and effectiveness as Executive Secretary to the Board.

Planning Committee. One of our most important organizations is the SCEC Planning Committee, which is chaired by Ralph Archuleta, SCEC’s Deputy Director. The PC has the responsibility for formulating the Center’s science plan, conducting proposal reviews, and recommending projects to the Board for SCEC support. Its membership includes the leaders of the major SCEC working groups—disciplinary committees, focus groups, and special project groups (Table 2).


Table 2. SCEC Working Group Leadership

Disciplinary Committees
Seismology: John Vidale (chair)*, Peter Shearer (co-chair)
Geodesy: Duncan Agnew (chair)*, Mark Simons (co-chair)
Geology: Tom Rockwell (chair)*, Mike Oskin (co-chair)
Fault & Rock Mechanics: Terry Tullis (chair)*, Judith Chester (co-chair)

Focus Groups
Structural Representation: John Shaw (leader)*, Jeroen Tromp (co-leader)
Fault Systems: Brad Hager (leader)*, Sally McGill (co-leader), James Dieterich (co-leader)
Earthquake Source Physics: Ruth Harris (leader)*, David Oglesby (co-leader)
Ground Motions: Paul Davis (leader)*, Robert Graves (co-leader)
Seismic Hazard Analysis: Ned Field (leader)*, David Jackson (co-leader)
Implementation Interface: Paul Somerville (leader)*, Robert Wesson (co-leader)

Special Project Groups
SCEC/ITR Project: Bernard Minster (liaison)*
Borderland Working Group: Craig Nicholson (chair)*
WInSAR Working Group: Mark Simons (chair)*

* Planning Committee members

Advisory Council. The Center’s external Advisory Council is charged with developing an overview of SCEC operations and advising the Director and the Board. Since the inception of SCEC in 1991, the AC has played a major role in maintaining the vitality of the organization and helping its leadership chart new directions.

The AC’s 2004 report focused on several key issues regarding the formulation of the SCEC3 proposal, and its analysis influenced the construction of the proposal in a very positive way. A verbatim copy of this report is included in this meeting volume. In February, 2005, the Council reviewed an intermediate draft of the SCEC3 proposal and rendered advice that proved very valuable in finalizing the document.

AC members serve three-year terms. During this past year, five members rotated off the Council: Raul Madariaga (Ecole Normale Superieure), Farzad Naeim (John A. Martin & Associates), Haresh Shah (RMS, Inc.), Robert Smith (U. Utah), and Susan Tubbesing (EERI).


We thank them for their distinguished service. We note that one of the retiring members, Bob Smith, who served as AC Chair from 2000-2004, will remain involved in SCEC as the representative for one of our new institutions, the University of Utah. (Remember, Bob, you can check out any time you want, but you can never leave!)

The current members of the Advisory Council are Sean Solomon (Carnegie Institution of Washington), who took over as chair of the AC last year; Jeff Freymueller (U. Alaska); Jack Moehle (PEER); Garry Rogers (Geological Survey of Canada); and Chris Rojahn (Applied Technology Council).

Added to their ranks are five new members, whom we welcome to the AC at this meeting: Gail Atkinson (Carleton University), Lloyd Cluff (Pacific Gas and Electric Co.), Patti Guatteri (Swiss Reinsurance), Kate Miller (University of Texas at El Paso), and John Rudnicki (Northwestern University). We are very fortunate to have such an exceptional group of experts providing the Center with advice.

Working Groups. The SCEC organization comprises a number of disciplinary committees, focus groups, and special project teams. These working groups are the engines of its success, and the discussions they organize at the annual meeting provide critical input to our reporting and planning processes.

The Center sustains disciplinary science through its standing committees in Seismology, Tectonic Geodesy, Earthquake Geology, and Fault and Rock Mechanics (a.k.a. the FARMers). These committees are responsible for coordinating disciplinary activities relevant to the SCEC science plan, and they make recommendations to the Planning Committee regarding the support of disciplinary activities and infrastructure.

Interdisciplinary research is organized into five science focus areas: Structural Representation, Fault Systems, Earthquake Source Physics, Ground Motion, and Seismic Hazard Analysis. The focus groups are the crucibles for the interdisciplinary synthesis that lies at the core of SCEC’s mission. For that reason, a substantial fraction of this annual meeting will be devoted to reviewing the focus-group activities and discussing their plans.

SCEC activities classified under special projects include the Southern California Integrated GPS Network (SCIGN), the WInSAR Consortium, the Borderland Working Group, and the Community Modeling Environment (CME), which is being developed under the SCEC/ITR project.

Following the recommendation of the SCEC Advisory Council in their 2004 report, the SCEC Board of Directors formed a subcommittee to review the future of the SCIGN organization as a standing committee of SCEC.

SCEC's role in oversight of SCIGN began in 1996, primarily to ensure that adequate funding could be found to build the array of 250 continuous GPS stations in Southern California and to monitor the construction of the network. With the completion of the SCIGN array in 2002, and with a long-term plan in place by 2004 to maintain the stations through a combination of support from PBO/UNAVCO/NSF, the USGS office in Pasadena, and the surveying community through UCSD, SCEC had completed its objectives. In light of these developments, particularly the initiation of the EarthScope Project, the Board subcommittee recommended that SCEC disband the SCIGN board as a standing committee of SCEC and that all future geodetic activities in SCEC be coordinated by the Tectonic Geodesy disciplinary committee.

The recommendation of the subcommittee was approved by the Board, the director, and agency officials of NSF and the USGS. The director notified the SCIGN group of the decision in late June, 2005. The SCEC administration will continue to assist the group in the array maintenance transition.

The WInSAR standing committee is now seeking offers for a new host institution as it hopes to renew itself as a global SAR archive.

Interdisciplinary research in risk assessment and mitigation is a primary subject for collaboration between SCEC scientists and partners from other communities—earthquake engineering, risk analysis, and emergency management. These partnerships are facilitated by an Implementation Interface, a structure based within the CEO program and designed to foster two-way communication and knowledge transfer. Representatives from a number of partnering organizations will be attending this meeting, and we should use this opportunity to discuss how our efforts toward implementing science for public benefit can be improved.

Communication, Education, and Outreach. Through its CEO Program, SCEC offers a wide range of student research experiences, web-based education tools, classroom curricula, museum displays, public information brochures, online newsletters, and technical workshops and publications.

Much progress has been made on the development of the Electronic Encyclopedia of Earthquakes (E3), a collaborative project with CUREE and IRIS. The E3 development system is now fully operational, and a number of encyclopedia entries are in the pipeline. When complete, E3 will include information and resources for over 500 Earth science and engineering topics, with connections to curricular materials useful for teaching Earth science, engineering, physics, and mathematics.

The “Earthquake Country Alliance,” organized to coordinate activities for the 10-year anniversary of the Northridge Earthquake in 2004, has continued to work together. The Alliance presents common messages, shares and promotes existing resources, and develops new activities and joint products, such as the new version of Putting Down Roots in Earthquake Country, now in distribution. It can be downloaded from www.earthquakecountry.info/roots/. Earthquakecountry.info is a multi-organizational collaboration to inform the public about earthquake hazards and safety, organized and hosted by SCEC. In 2005 SCEC worked with a large group of Bay Area scientists, engineers, and emergency managers, led by the USGS, to create the first Bay Area version of Roots (soon to be available through the www.earthquakecountry.info/roots/ website as well), and a Spanish-language edition is being prepared for printing in late 2005. A new video, Written in Stone: Earthquake Country – Los Angeles, has been produced and will be distributed in curricular kits to schools and community groups. The Los Angeles Unified School District plans to include the video in a new sixth-grade science unit in every school in the district.
SCEC’s Summer Intern program has grown to a new level and now has a year-round counterpart with students working on IT projects at USC and other institutions.  Since last summer, 33 students have participated in the program, including 11 students working with scientists throughout SCEC and 22 students enrolled in the USC-based Undergraduate Studies in Earthquake Information Technology (UseIT) program.


Center Budget and Project Funding

The 2005 base funding for the Center is $2,622K from the National Science Foundation and $1,100K from the U.S. Geological Survey. The NSF funding was 9.3% below SCEC’s expected allocation and 5% less than last year, owing to budgetary problems within EAR. The base funding of $3.722M was augmented with $60K from the California Earthquake Authority and $50K held over from 2004 for the NGA-H initiative.

The base budget approved by the Board of Directors for this year allocated $2,750K for science activities managed by the SCEC Planning Committee; $380K for communication, education, and outreach activities, managed by the CEO Associate Director, Mark Benthien; $142K for information technology, managed by the Information Architect, Phil Maechling; $280K for administration and $150K for meetings, managed by the Associate Director for Administration, John McRaney; and $130K for the director's reserve account. In addition, the Center received $2,000K from NSF's Information Technology Research (ITR) Program for continuing development of the SCEC Community Modeling Environment, and $511K from NSF's Engineering program for SCEC research in seismic risk reduction. The project managers for the ITR and Engineering grants are Phil Maechling and Paul Somerville, respectively.

I will use this opportunity to review how science projects have been funded as part of the SCEC collaboration, since this ongoing process will be a major concern of the annual meeting. The process of structuring the SCEC program for 2005 began with the working-group discussions at our last annual meeting in September, 2004. An RFP was issued in October, 2004, and 149 proposals (128 projects, considering collaborations) requesting a total of $4,520K were submitted in November, 2004. All proposals were independently reviewed by the director and deputy director. Each proposal was also independently reviewed by the chairs and/or co-chairs of three relevant focus groups or disciplinary committees. (Reviewers were required to recuse themselves when they had a conflict of interest.) The Planning Committee met on January 17-18, 2005, and spent two long days discussing every proposal. The objective was to formulate a coherent, budget-balanced science program consistent with SCEC's basic mission, short-term objectives, long-term goals, and institutional composition. Proposals were evaluated according to the following criteria:

a. Scientific merit of the proposed research.
b. Competence and performance of the investigators, especially in regard to past SCEC-sponsored research.
c. Priority of the proposed project for short-term SCEC objectives.
d. Promise of the proposed project for contributing to long-term SCEC goals.
e. Commitment of the P.I. and institution to the SCEC mission.
f. Value of the proposed research relative to its cost.
g. The need to achieve a balanced budget while maintaining a reasonable level of scientific continuity given the very limited Center funding.

The recommendations of the PC were reviewed by the SCEC Board of Directors at a meeting on February 7-8.  The Board voted unanimously to accept the PC's recommendations, pending a final review of the program by the Center director, which was completed on February 25.

On June 7, the SCEC Planning Committee met after the SCEC3 site visit and began the process of formulating the 2005 RFP; their draft will be put up for scrutiny at this annual meeting. I urge you to participate fully in these discussions. Based on the community input, the PC will modify their draft, and the final RFP will be released in October.

Accomplishments

Many of the scientific results of the SCEC collaboration are detailed in the abstracts of presentations and posters included in this Meeting Volume, and others will be discussed in the working-group sessions throughout the Annual Meeting.

This was an exceptionally busy year in terms of SCEC initiatives. For example, we developed a plan in collaboration with the USGS and CGS for a new Working Group on California Earthquake Probabilities, which has been chartered to produce a time-dependent uniform California earthquake rupture forecast by 2007. WGCEP’07, which is led by Ned Field, will report to a management oversight committee (MOC) chaired by the SCEC director.

The California Earthquake Authority (CEA) has allocated $1.75 million for the project, which will be managed by SCEC through the MOC. The CEA has also funded SCEC projects in ground motion attenuation studies and end-to-end (“ruptures-to-rafters”) simulations.

Plans were also formulated for a set of international collaborations under the new Multinational Partnership in Earthquake System Science (MPRESS), a new Collaboratory for the Study of Earthquake Predictability (CSEP), and several other initiatives. These are outlined in the SCEC3 proposal (see the SCEC3 Proposal Summary below).

Rather than attempt a synopsis of the many projects that SCEC is coordinating, I will simply list several documents which you can download from the web to find detailed reports (http://www.scec.org/aboutscec/documents/). Of course, you can find a lot more information about SCEC activities through our main webportal (http://www.scec.org).

SCEC 2004 Annual Report (December, 2004). This large, comprehensive document (188 pp., 8.1 MB) provides an excellent snapshot of the Center’s 2004 activities. It comprises the following sections:

I. Introduction
II. Planning, Organization, and Management of the Center
III. Research Accomplishments
IV. Communication, Education, and Outreach Activities
V. Director's Management Report
VI. Advisory Council Report
VII. Financial Report
VIII. Report on Subawards and Monitoring
IX. Demographics of SCEC Participants
X. Report on International Contacts and Visits
XI. Publications
Appendices: Long-Term Research Goals, By-Laws, and 2005 RFP

SCEC/CME 2004 Annual Report (June, 2005). In 2001, SCEC was funded by NSF's ITR Program for a large project ($10M for 5 yr) to develop a new information infrastructure for earthquake science—the SCEC “Community Modeling Environment” (CME). The fourth annual report on the CME (89 pp., 6.3 MB) can be downloaded from the SCEC document website. Further information, including a wide variety of products, capabilities, and reports, can be found on the CME website (http://epicenter.usc.edu/cmeportal/).

Putting Down Roots in Earthquake Country (2005). The new edition of this widely distributed public-information document was released by SCEC and the USGS on the anniversary of the Northridge earthquake and has been updated twice since. It can be downloaded from the new website (www.earthquakecountry.info/roots/).

The SCEC3 Proposal

The current phase of the Center (SCEC2) extends for five years, until January 31, 2007. In early March, 2005, we submitted the SCEC3 proposal to NSF and the USGS, which lays out our plans to extend Center operations for the 5-year period 2007-2012. The construction of this proposal was a truly collaborative enterprise that stretched over a nine-month period and involved the Board of Directors, Planning Committee, Advisory Council, the entire SCEC staff, and many, many SCEC participants. I want to express my deepest gratitude to all of you for your outstanding efforts on behalf of the Center.

The result is a truly impressive document. I urge all SCEC participants to download it from the SCEC documents webpage (http://www.scec.org/aboutscec/documents/) and read it. The proposal was sent out for mail reviews and, in early June, a group of experts assembled by NSF and the USGS convened for a panel review at USC. We hope to receive word on the status of this proposal by the Annual Meeting. Reproduced below is the Proposal Summary.

SCEC3 Proposal Summary

The Southern California Earthquake Center was created as a Science & Technology Center in 1991 by NSF and the USGS. SCEC was renewed in 2002, and its size has since expanded to 54 institutions involving over 560 scientists. The core institutions, currently 15, are committed to SCEC’s mission and offer sustained support for its programs; the participating institutions, currently 40, are self-nominated through their members’ participation.

The Center is open to any credible scientist from any research institution interested in collaborating on the problems of earthquake science. However, its program is structured to achieve prioritized science objectives within the Southern California Natural Laboratory, and resources are allocated accordingly. Research projects are supported on a year-to-year basis by a competitive, collaboration-building process that involves extensive interactions among 14 working groups, a Joint Planning Committee with the USGS, the SCEC Board of Directors, and an External Advisory Council. In 2005, SCEC will sponsor 123 projects by 156 principal investigators at 51 institutions. The overall program includes a number of additional USGS investigators, as well as many collaborators supported by SCEC’s partner organizations.

Science Goal and Mission. SCEC’s basic science goal is to understand the physics of the Southern California fault system and encode this understanding in a system-level model that can predict salient aspects of earthquake behavior. Southern California’s network of several hundred active faults forms a superb natural laboratory for the study of earthquake physics. Its seismic, geodetic, and geologic data are among the best in the world. Moreover, Southern California contains 23 million people, so its high seismic hazard translates into nearly one-half of the national earthquake risk.

The Center’s tripartite mission statement emphasizes the connections between information gathering, knowledge formulation through physics-based modeling, and public communication of hazard and risk. An important part of SCEC’s mission is to increase the diversity of its scientific workforce; it values diversity in all aspects of its activities.

Intellectual Merit of the Proposed Research. Earthquakes are one of the great unsolved puzzles of science. The study of earthquakes concerns two basic geophysical problems: (a) the dynamics of fault rupture—what happens on a time scale of seconds to minutes when a single fault breaks during a given earthquake—and (b) the dynamics of fault systems—what happens within a fault network on a time scale of hours to centuries to generate a sequence of earthquakes. These highly nonlinear problems are coupled to one another through the complex processes of brittle and ductile deformation. No theory adequately describes the basic features of dynamic rupture, nor is one available that fully explains the dynamical interactions among faults, because we do not yet understand the physics of how matter and energy interact during the extreme conditions of rock failure. The major research issues of earthquake science are true system-level problems—they require an interdisciplinary, multi-institutional approach that considers the nonlinear interactions among many fault-system elements. SCEC will advance earthquake science through a comprehensive program of system-specific studies in Southern California.

Broader Implications of the Proposed Research. Earthquakes pose the greatest natural threat to the built environment of California and other seismically active regions. Probabilistic seismic hazard analysis (PSHA) is the primary methodology used to ensure the public’s seismic safety. SCEC research will incorporate physics-based methods into PSHA, which will provide better earthquake forecasts and better estimates of strong ground motions. The Center will extend this research beyond Southern California through its national and international research collaborations. Through partnerships with earthquake engineers, it will also generalize the natural system under consideration to include built structures, thereby extending its seismic hazard analysis to earthquake risk. Through its Communication, Education & Outreach (CEO) Program, it will provide society at large with useful knowledge for reducing earthquake risk.

Accomplishments. SCEC scientists engaged in data collection have come together with theoreticians and numerical modelers in a collaborative process that has greatly accelerated the understanding of seismic hazards in Southern California and elsewhere. The results have been incorporated into practical products, including the National Seismic Hazard Maps of 2002 and the new seismic attenuation relations developed by the Next Generation Attenuation Project. SCEC’s achievements contributed to the launching of NSF’s EarthScope initiative in 2003. For example, the Center developed the 250-station Southern California Integrated GPS Network (SCIGN), the largest outside of Japan, which has served as a prototype for EarthScope’s Plate Boundary Observatory.

This proposal highlights scientific accomplishments in six problem areas central to earthquake system science.

Fault Mechanics. New types of laboratory experiments have elucidated the frictional resistance during high-speed coseismic slip, and these data have been combined with field studies of exhumed faults to develop better models of dynamic rupture.

Earthquake Rupture Dynamics. Codes for 3D dynamic rupture simulation have been validated by cross-comparison exercises; they are being verified by comparisons with laboratory experiments and real earthquakes and coupled with anelastic wave propagation models to investigate strong ground motions.

Structural Representation. The Community Velocity Model (CVM) has been improved by extending and refining its 3D elastic structure and incorporating attenuation parameters; a new Community Fault Model (CFM) representing more than 140 active faults has been developed and extended to a Community Block Model (CBM), and a prototype Unified Structural Representation (USR) is merging the CVM into the CBM structural framework.

Fault Systems. New deformation signals have been discovered by InSAR and GPS, and new data from SCIGN and GPS campaigns have been incorporated into the Crustal Motion Map (CMM). The geologic record of fault-system behavior has been significantly expanded; tectonic block models have been created for physics-based earthquake forecasting, and finite-element codes have been developed for a new CBM-based deformation model that will assimilate the CMM and geologic data.

Earthquake Forecasting. New paleoseismic data and data-synthesis techniques have been used to constrain earthquake recurrence intervals, event clustering, and interactions among faults. Relocated seismicity has mapped new seismogenic structures and provided better tests of earthquake triggering models. Regional earthquake likelihood models have been formulated for use in PSHA and earthquake predictability experiments, and they are being tested for prediction skill using a rigorous methodology.

Ground Motion Prediction. Earthquake ground motions have been simulated using the CVM, realistic source models, and validated wave-physics codes; high-frequency stochastic methods have been combined with low-frequency deterministic methods to attain a broadband (0-10 Hz) simulation capability; broadband predictions have been tested against precarious-rock data; and simulations have been used to improve attenuation relationships and create realistic earthquake scenarios.

The CEO program has expanded SCEC partnerships in science, engineering, risk management, government, business, and education; increased earthquake knowledge and science literacy at all educational levels; worked with partners to improve earthquake hazard and risk assessments; and promoted earthquake preparedness, mitigation, and planning. An Implementation Interface has been constructed to integrate physics-based SHA into earthquake engineering research and practice through collaborations with PEER, CUREE, and the Next Generation Attenuation (NGA) Project; it has provided a flexible computational framework for system-level hazard and risk analysis through the OpenSHA platform, and it is developing an interface between SCEC and the NSF Network for Earthquake Engineering Simulation (NEES).

CEO highlights include a very successful new intern program, Undergraduate Studies in Earthquake Information Technology (USEIT); the development of the Electronic Encyclopedia of Earthquakes as part of the NSF National Science Digital Library; the establishment of the Earthquake Country Alliance to present consistent earthquake information to the public; and a new edition of Putting Down Roots in Earthquake Country in both English and Spanish.

Science Plan. The SCEC3 Science Plan is articulated in terms of four basic science problems that organize the most pressing issues of earthquake system science.

A. Earthquake Source Physics: to discover the physics of fault failure and dynamic rupture that will improve predictions of strong ground motions and the understanding of earthquake predictability.

B. Fault System Dynamics: to develop representations of the postseismic and interseismic evolution of stress, strain, and rheology that can predict fault system behaviors.

C. Earthquake Forecasting and Predictability: to improve earthquake forecasts by understanding the physical basis for earthquake predictability.

D. Ground Motion Prediction: to predict the ground motions using realistic earthquake simulations at frequencies up to 10 Hz for all sites in Southern California.

In each problem area, we state the research issues, identify specific objectives, and assess the requisite research activities and capabilities. Based on this assessment, we formulate a new working-group structure to enact the Science Plan.

The SCEC3 Science Plan motivates eight initiatives that will augment the basic research program.

1. Networks as Research Tools: to foster innovations in network deployments and data collection that can provide researchers with new information on earthquake phenomena. Plans include a real-time demonstration project in seismic early warning in partnership with CISN.

2. Southern San Andreas Fault: to mobilize a major effort on the collection and interpretation of geologic data to understand the earthquake history of the SSAF system.


3. Working Group on California Earthquake Probabilities: to develop in partnership with the USGS and CGS a uniform California earthquake rupture forecast by combining new information with the best available methodologies for time-dependent forecasting.

4. Next Generation Attenuation Program: to produce in partnership with PEER-Lifeline and the USGS more reliable ground motion attenuation models that are based on physics as well as data.

5. “Rupture to Rafters”: to develop in partnership with earthquake engineers a capability for the end-to-end simulation of earthquake processes, including embedding built structures in geologic models. This analysis will be used in new types of risk assessment.

6. Collaboratory for the Study of Earthquake Predictability: to provide a stable environment for registering earthquake predictions and conducting long-term predictability experiments that are properly characterized and can be properly evaluated.

7. National Collaborations Through EarthScope: to apply SCEC’s system-level approach to other fault systems in the United States and collaborate on a national scale in comparative studies of fault system dynamics and earthquake behavior.

8. International Collaborations: to develop multinational partnerships that will promote comparative studies of fault systems and international cooperation in earthquake system science.

We outline the objectives of each initiative, its resource requirements, the participants and organizational partners, and the mechanisms that we will pursue to obtain additional resources. The last point is critical, because the ambitious research program proposed for SCEC3, particularly in the realm of applied studies, will require funding sources beyond the Center base budget proposed here.

The CEO program is an essential component of the Science Plan through its management of external partnerships that foster new research opportunities and its delivery of research and educational products to society at large.

In SCEC3, the Center will expand its CEO activities through partnerships with new groups, such as the EarthScope Education & Outreach Program and the NEES Education, Outreach & Training Program. The CEO focus areas will include partnerships in seismic hazard & risk analysis, primarily with research engineers; knowledge transfer partnerships and programs for technical professionals and government officials; education programs and products for students and educators; and public outreach to the general public, civic and preparedness groups, and the news media. As in SCEC2, CEO will organize community development programs for SCEC participants.

Management Plan. SCEC3 will continue to operate under the lean, flexible, and very successful management structure developed for SCEC2. However, to implement the Science Plan, we will make significant changes in the organization of the working groups, as shown on the SCEC3 organization chart.


Recognizing that diversity is a long-term issue that requires continuing assessments and constant attention by the leadership, the Center has taken a number of concrete steps to assess the diversity of its workforce and to develop policies for increasing diversity. Tangible progress has been made in populating SCEC leadership positions with outstanding women and minority scientists, and a long-term plan has been enacted to make further improvements. A key pipeline strategy is to recruit minority students into the SCEC intern programs and encourage them to pursue research careers at SCEC institutions. These recruitment and retention activities will be expanded in SCEC3.

In closing, I want to express my thanks to all of you for your attendance at the meeting and your sustained commitment to the SCEC collaboration. Please do not hesitate to contact me personally if you have questions or comments about our activities, accomplishments, and plans.


Report of the Advisory Council
Southern California Earthquake Center

September 2004 Meeting

Introduction

The Advisory Council of the Southern California Earthquake Center (SCEC) met during the 2004 SCEC Annual Meeting, held in Palm Springs, California, on 19-23 September 2004. The principal meeting of the Council was during the evening of 22 September; an earlier executive session of the Council was held prior to the start of the Annual Meeting on 19 September to outline areas of focus. A report of the principal findings and recommendations was made orally to those attending the Annual Meeting during the closing session on the morning of 23 September.

Prior to the Annual Meeting the SCEC Director circulated to Advisory Council members a three-page list of issues warranting Council attention. Those issues included assessments of SCEC’s system-level approach to earthquake science; SCEC’s partnership activities; the geographic scope of SCEC’s focus; the goals and objectives for the next proposed phase of the Center (so-called SCEC3); and the membership, agenda, and meeting schedule of the Council. For each major issue, the SCEC Director posed a series of specific subsidiary questions.

After some general comments, we group the bulk of our discussion and recommendations below in line with those five issues and the corresponding subsidiary questions.

General Impressions and Recommendations

Because the members of the Advisory Council are not also members of SCEC, the Annual Meeting is of particular importance as a measure of annual progress on the goals and programs of the Center. One metric of that progress is meeting attendance, which continues to grow and reached an all-time high at this year’s meeting. Another is the range of topics on which new results were presented and engaging discussions ensued. Even compared with one year earlier, the diversity of subjects treated and the maturity of much of the Center’s highest-priority work have advanced noticeably.

Presentations on two topics made particularly positive impressions on Advisory Council members. The first is the Community Modeling Environment, the managed computational facility for validating and inter-comparing numerical codes for fault rupture, wave propagation, and other elements of the seismic hazard analysis problem. The combination of state-of-the-art information technology tools for computation and visualization together with the integrative, open approach promises to provide a critical resource both to seismologists and to the engineering and management user communities.


The second is the TerraShake simulation of ground motions from a specified model of fault rupture within a three-dimensional representation of the fault system and seismic velocity structure of Southern California. This computational tour de force, with its compelling visualization of wave propagation and ground accelerations, provided dramatic graphical lessons concerning the effects of rupture directivity and the focusing of energy by sediment-filled basins and other structures. The promise of such simulations for understanding seismic hazards, and for pointing in directions where improved observations or better models would be most worthwhile, is enormous.

On the basis of all of the presentations and discussions at the Annual Meeting, the Advisory Council has several general recommendations to offer.

SCEC should enhance the communication of its activities, accomplishments, and plans to the greater Earth science and earthquake engineering communities and to the public. There is an enormous body of very exciting scientific work being carried out by SCEC members and through SCEC’s partnerships with other organizations. It is the impression of Advisory Council members, however, that the broader community of Earth scientists and earthquake engineers is unfamiliar with much of this effort. SCEC should do more to publicize its work, through organized sets of presentations at professional meetings, publications in professional journals, targeted articles in the lay media, and internet-based materials. Presentations and publications of SCEC-sponsored research should consistently give explicit credit to such sponsorship. Regular updates to SCEC’s web site (including pages currently “under construction”) would also serve to enhance the Center’s visibility as a focus of community-directed activity.

SCEC should develop a plan for how it will coordinate, in partnership with relevant federal and state agencies, a science community response to a large earthquake in southern California. The need for such a plan was underscored by the Parkfield, California, earthquake of 28 September, just one week after the Annual Meeting. Because SCEC is a multi-institutional center for earthquake science, society will look to it to provide scientific leadership in the immediate aftermath of any large seismic event in the southern California region. SCEC should have a clear protocol for how it will provide that leadership.

SCEC has set out ambitious goals and several milestones to be attained in the pursuit of those goals. Building on those plans, SCEC should develop clear metrics for the successful achievement of its goals. These “success criteria” should be applied both to past activities, in the development of a rationale for continuing SCEC into its next phase, and to activities proposed for the SCEC3 era. Demonstrating that those success criteria can be satisfied for planned efforts will enhance the case for SCEC3.

SCEC’s System-level Approach

A primary goal of SCEC during its second phase as a Center (so-called SCEC2) has been to develop a system-level approach to earthquake science that can improve seismic hazard analysis and contribute to a reduction in earthquake risk. The Advisory Council was specifically asked:


a. Has this approach been successful in advancing earthquake science? Will it lead to substantial improvements in seismic hazard analysis?

b. Is it an appropriate basis for continuing the SCEC collaboration?

c. Do [SCEC’s] accomplishments on this problem warrant the continuation of the program into the next 5-year phase (SCEC3)?

In response to these queries, the Advisory Council affirms that the system-level approach to earthquake science that SCEC has pioneered is novel, appears to be demonstrating substantial progress, and is the type of integrative effort most appropriate for a multi-institutional, mission-focused Center. The approach that SCEC has taken in its system-level representation of tectonic elements and seismic structure in Southern California — including the Community Velocity Model, the Community Fault Model, the Community Crustal Motion Map, the Community Block Model, and the Unified Structural Representation — integrates all available observations through an open process that involves all interested members of the community. In parallel with the development of regional models, through the Community Modeling Environment SCEC has developed a system-level approach to the management of simulations and visualizations and the curation of data products. Most importantly, the well-documented SCEC approach stands as an exemplary model on which similar efforts for other earthquake-prone regions can build.

As noted above, end-to-end simulations of ground motion from specific fault rupture scenarios presented at the Annual Meeting constitute compelling evidence that SCEC’s system-level approach promises to provide a capability for substantial improvements to ground motion estimation and seismic hazard analysis. Whether that capability will modify current methods for such analysis is less clear at present. In large measure, the success of SCEC’s effort to improve the state of seismic hazard analysis will depend on the extent to which the user community adopts SCEC’s tools and methodology as standard practice.

SCEC’s system-level approach to earthquake science is nonetheless a clearly appropriate basis for continuing the SCEC collaboration. The community models and modeling environment are just now reaching levels of maturity to test diverse scenarios for Southern California fault behavior. Further, as mentioned earlier, these models should provide a clear basis for deciding where new observations and observational approaches are needed to fill gaps in knowledge or to foster new monitoring tools. Finding an optimum balance between system-level and observational approaches will be a high priority for SCEC throughout the lifetime of the Center.

The answer to the final question above should be obvious. SCEC’s accomplishments to date readily warrant continuation of the Center’s programs into another 5-year phase (SCEC3). Proposals to federal, state, and private organizations for support of such an endeavor should be prepared as opportunity permits.


SCEC’s Partnership Activities

To accomplish its goal of reducing earthquake risk, SCEC has sought a range of partnerships in earthquake engineering, emergency management, and public outreach and education. The Advisory Council has been asked:

a. How effective has SCEC been in creating and managing its partnerships?

b. In particular, how would [the Council] evaluate SCEC’s Implementation Interface activities?

c. What new partnerships should be considered for SCEC3 to improve [the Center’s] impact on risk reduction?

d. Is there too much or too little emphasis on practical products for seismic hazard analysis and risk reduction?

SCEC has entered into a number of promising partnerships. The new Implementation Interface provides a focus for collaborations with the earthquake engineering community, exemplified by the Pacific Earthquake Engineering Research Center (PEER)-Lifelines/SCEC/U.S. Geological Survey (USGS) Next Generation Attenuation Project and the end-to-end (“Rupture to Rafters/Rivets”) simulation initiative. As part of SCEC’s Communication, Education, and Outreach Program, partnered activities include the SCEC/CUREE/IRIS Electronic Encyclopedia of Earthquakes and the web portal managed by the Earthquake Country Alliance. In general, these partnerships provide an effective means for engaging the user community and for leveraging SCEC efforts.

Nonetheless, there is more that can be done, particularly through partnerships with organizations that are now applying earthquake information. In particular, SCEC should enhance its awareness of current directions being taken by the engineering community to develop next-generation methods for performance-based design and to address the Los Angeles community’s most pressing concerns regarding seismically hazardous structures. Open, synergistic partnerships with organizations whose goals advance these causes are encouraged. The partnership with PEER is a laudable example of this type of collaboration, but others should be pursued as well.

SCEC’s Geographic Scope

SCEC has a natural and appropriate focus on Southern California as a laboratory for earthquake science and hazards. Nonetheless, seismology is informed by insight gained from earthquakes throughout the globe. The Advisory Council was therefore asked:

a. What is the appropriate geographic scale for SCEC science? Should it remain a regional Center?

b. Are [SCEC’s] initiatives to form other regional partnerships an appropriate way to diversify the study of earthquake systems?


c. Would [the Council] encourage [SCEC] to put forward an international Center-based initiative to the NSF Office of International Science and Engineering? What are the pace and selection issues associated with such an initiative?

For several reasons, the regional scale adopted by SCEC to date is still highly appropriate. Southern California remains one of the best — arguably the best — natural laboratory for earthquake science because of the spatial and temporal coverage of diverse instrumentation, the variety of fault geometries and tectonic settings, and the large population of area residents for whom improved hazard analysis will enhance safety and reduce economic vulnerability. A large fraction of SCEC members are from institutions within Southern California, which can most readily maintain local geological and seismological field programs. The enormous investment by SCEC to date in understanding the tectonics and structure of Southern California provides additional rationale to continue such a focus.

That said, ongoing SCEC initiatives to form regional partnerships are appropriate mechanisms to export SCEC products and to expand the suite of natural laboratories from which to gather information on earthquake physics. The SCEC-sponsored Basin and Range working group and the acceptance to date of seven foreign universities and research organizations as SCEC Participating Institutions are noteworthy examples of these initiatives. There are nonetheless real limits to the number and diversity of regional partnerships in which SCEC can maintain an active role at any one time. SCEC should therefore select its partnerships carefully, emphasizing those that can best advance overall SCEC goals.

Goals and Objectives for SCEC3

Plans for the next 5-year phase of SCEC received an understandably large share of attention during the Annual Meeting, as SCEC leadership and membership wrestle with the question of how best to prioritize goals for SCEC3. The Advisory Council was specifically asked to comment on the following questions:

a. What will be the goals and objectives of SCEC3? Basic science goals? New product goals (e.g., system-level models of the Southern California laboratory; hazard and risk models)? Time-dependent earthquake forecasting (earthquake prediction)? End-to-end (“rupture to rafters”) simulation? SCEC collaboratory (Community Modeling Environment)?

b. What structural changes should be made to prepare for SCEC3? How to transition WInSAR and SCIGN to the PBO era? New working groups for EarthScope and/or Tectonophysics? Move forward on international collaborations? Major new partnerships and/or other organizations? Leadership transitions?

By the end of SCEC2, several of the Center’s activities will still be underway, and some will have considerable scientific momentum. Activities during SCEC3 that harvest such momentum can therefore be anticipated. Nonetheless, to present a compelling vision for continued funding, SCEC3 should offer several new initiatives — directions in earthquake science beyond those now being pursued by the community.


Without prejudging the selection of what those initiatives should be, the Advisory Council recommends that those initiatives should satisfy several criteria. They should be sharply focused. They should be based firmly on fundamental questions in basic science. They should address goals that are achievable only by a multi-institutional Center. And those goals should be attainable within a 5-year time frame.

Several structural changes are appropriate in preparation for a transition from SCEC2 to SCEC3. SCEC’s stewardship of SCIGN and WInSAR has been critical to date, but with the onset of the EarthScope program it is now timely to transfer these activities to appropriate alternative organizations. At the same time, explicit links with EarthScope should be strengthened through one or more SCEC organizational units.

Within SCEC, a Tectonophysics Focus Group is warranted on scientific grounds; some rebalancing of assignments among working groups may be needed as a consequence. As noted above, carefully selected international collaborations make sense as a means to broaden the sweep of natural laboratories and enhance the opportunity for important lesson-forming events.

SCEC has made a visible effort to promote early-career scientists to leadership positions within Center activities. This laudable effort takes optimum advantage of the energy and ideas of younger members of the community, offers opportunities to enhance the diversity of SCEC leadership, and builds a cadre of younger leaders to whom the Center and the community can turn when transitions in senior leadership are needed.

Advisory Council Issues

The structure and charter for the Advisory Council should be devised so as to provide the most effective and constructive feedback for the Center on a regular basis. Specific questions posed to the Council included the following:

a. Should [SCEC] consider new appointments or rotations, especially given the difficulty some members have had in attending AC meetings? Should [SCEC] add expertise in other areas, such as IT?

b. What should be the focus of the AC during the SCEC3 planning process? A SCEC3 proposal will require external assessments, probably at several levels. Should the AC configure a formal assessment process?

c. What should be the AC’s meeting schedule? Thus far in SCEC2, the AC has met yearly at the Annual Meeting. Should a mid-year meeting be added, which was the tradition in the early days of SCEC1?

To provide the continued infusion of fresh ideas to SCEC planning efforts, the Center should consider instituting a formal rotation of Advisory Council members. The earliest rotations should be for those members whose schedules make it difficult for them to participate in SCEC Annual Meetings.


As new appointments are made to the Advisory Council, the expertise represented should be broadened over that of the current Council membership. Adding an expert in Information Technology should be a top priority.

The Advisory Council will assist in the preparation of the proposal for SCEC3 by providing a review of a pre-submission proposal draft. The Council is willing to add a mid-year meeting to enable such a review.

Concluding Comments

The Advisory Council is pleased to provide continued assistance to SCEC in its efforts to formulate and accomplish its major goals. The Council invites comments, criticism, and advice from the seismological community, including those both inside and outside SCEC membership, on how best to provide that assistance.

The Advisory Council looks forward to working with SCEC leadership to craft a compelling scientific and societal rationale for the continuation and expansion of SCEC activities.

2 February 2005

SCEC Advisory Council

Sean C. Solomon, Carnegie Institution of Washington (Chair)*
Jeffrey T. Freymueller, University of Alaska*
Raul Madariaga, Ecole Normale Supérieure, Paris
Jack P. Moehle, Pacific Earthquake Engineering Research Center*
Farzad Naeim, John A. Martin & Associates
Garry C. Rogers, Geological Survey of Canada*
Chris Rojahn, Applied Technology Council*
Haresh Shah, Risk Management Solutions, Inc.
Robert B. Smith, University of Utah*
Ellis M. Stanley, Sr., City of Los Angeles Emergency Preparedness Department
Susan Tubbesing, Earthquake Engineering Research Institute

*Participated in the Advisory Council meetings of 19-21 September 2004


2005 SCEC Communication, Education and Outreach (CEO) Program

The SCEC2 Communication, Education, and Outreach (CEO) program is built upon the methods, achievements, and experience of SCEC’s eleven years as an NSF Science and Technology Center, and on a series of community planning workshops held prior to SCEC2. These workshops led to a program with four long-term goals that have been pursued during SCEC2:

Coordinate productive interactions among a diverse community of SCEC scientists and with partners in science, engineering, risk management, government, business, and education (see Figure 1.A.4).

Increase earthquake knowledge and science literacy at all educational levels, including students and the general public.

Improve earthquake hazard and risk assessments.

Promote earthquake preparedness, mitigation, and planning for response and recovery.

CEO is well integrated within the SCEC science planning process. This includes participation of CEO staff in the development of short-term research objectives and evaluation of proposals received each year in order to develop products and services needed by our various audiences (Figure II.C.1). SCEC scientists in turn are involved in developing and fulfilling CEO short-term objectives, which are organized within four CEO focus areas: education programs and resources for students, educators, and learners of all ages; public outreach activities and products for the general public, civic and preparedness groups, and the news media; knowledge transfer activities with practicing professionals, government officials, scientists and engineers (with research partnerships coordinated within the SCEC implementation interface); and SCEC Community development activities and resources for SCEC scientists and students.

The list of activities is long and SCEC’s organizational relationships are often complex, but we emphasize that the Center’s resources, including its staff time, are carefully allocated through a prioritization process that maintains good alignment between the CEO and science objectives. For example, the yearly revisions to the CEO plan are articulated within the revised SCEC Science Plan, published each October, which solicits annual proposals from the SCEC Community; the proposals that respond to the CEO solicitation are evaluated alongside the science proposals in the collaboration-building process managed by the Planning Committee. This mechanism involves scientists in setting and achieving the CEO objectives.

Figure II.C.1. SCEC Activity Development and Assessment process relative to the SCEC Mission. Both initial data-gathering activities and integrated research results create or support a broad range of products and services. Assessments of these products and services inform subsequent planning.

Page 48: 2004 SCEC Annual Meeting · Web viewThis large, comprehensive document (188 pp., 8.1 MB) provides an excellent snapshot of the Center’s 2004 activities. It comprises following sections:

SCEC CEO activities, showing how many activities span more than one CEO focus area. Activities within the SCEC Community Development focus area are shown outside the three circles, though they have connections to many of the activities shown.

SCEC CEO Team

Staff

Mark Benthien, director
John Marquis, digital products manager
Bob de Groot, education specialist
Sue Perry, earthquake information technology student programs manager

Student Employees

Ilene Cooper, education specialist
Brion Vibber, web specialist
Monica Maynard, education specialist and Spanish translator

Consultant

Paul Somerville, Implementation Interface project manager


Education Activities

SCEC and its expanding network of education partners are committed to fostering increased earthquake knowledge and science literacy at all educational levels, especially K-12 and college-level education in Earth science. In addition to the activities described below, SCEC CEO is also developing an undergraduate earthquake course with new visuals and online interactive modules, revising the Seismic Sleuths earthquake curricula, supporting activities at the SCEC-developed ShakeZone museum exhibit, and working with the Los Angeles Unified School District on a sixth-grade earth sciences unit that will include SCEC images and videos.

Objectives

The SCEC2 objectives for the Education focus area are to (1) interest, involve, and retain students in earthquake science, (2) develop innovative earth-science education resources, and (3) offer effective professional development for K-12 educators.

Results

SCEC Undergraduate Internship Program. Since 1994, SCEC has supported over 160 students (including over 70 women and over 50 minority students) to work alongside over 65 SCEC scientists. The program has expanded in recent years: in 2004, SCEC supported 34 undergraduate students. SCEC interns are paid a stipend of $5000 over the summer with support from the NSF REU program. SCEC offers two summer internship programs, SCEC/SURE and SCEC/USEIT. These programs are the principal SCEC framework for undergraduate student participation in SCEC, and they share the goals of increasing diversity and retention. In addition to their research projects, participants come together several times during their internship for orientations and field trips, and to present posters at the SCEC Annual Meeting. Students apply for both programs at http://www.scec.org/internships.

Each summer since 1994, the SCEC Summer Undergraduate Research Experience (SCEC/SURE) has paired undergraduate interns one-on-one with SCEC scientists. The goals of SCEC/SURE are (1) to provide hands-on experiences for undergraduates and expand student participation in the earth sciences and related disciplines, (2) to encourage students to consider careers in research and education, and (3) to interest, train, and retain talented students, including women, members of underrepresented minorities, persons with disabilities, and students outside the earth sciences. SCEC/SURE has supported students working on numerous topics in earthquake science, including the history of earthquakes on faults, risk mitigation, seismic velocity modeling, science education, and earthquake engineering.

The SCEC Undergraduate Studies in Earthquake Information Technology (SCEC/USEIT) program unites undergraduates from across the country in an NSF REU Site at USC. Since Summer 2002, 64 students majoring in computer science, engineering, geoscience, cinema, economics, mathematics, architecture, communications, and pre-law have participated. SCEC/USEIT interns interact in a team-oriented research environment with some of the nation's most distinguished geoscience and computer science researchers. The goals of the program are (1) to allow undergraduates to use advanced tools of information technology to solve important problems in interdisciplinary earthquake research; (2) to close the gap between two fields of undergraduate study, computer science and geoscience; and (3) to engage non-geoscience majors in the application of earth science to the practical problems of reducing earthquake risk.

SCEC/USEIT interns have developed the "LA3D" and “SCEC-VDO” (Visual Display of Objects) visualization platforms: object-oriented, open-source, Internet-enabled systems. These tools are being used by SCEC researchers interested in displaying objects that represent the complex subsurface structure of Southern California. The interns are encoding visualization objects, creating a visual vocabulary of earthquake-related objects that are interconnected into a new visual ontology. In addition, the interns have built scripting capabilities into the tools to allow the creation of visual stories that communicate the results of SCEC system-level research.


Electronic Encyclopedia of Earthquakes (E3). SCEC is developing this digital library of educational resources and information with CUREE and IRIS, with funding from the NSF National Science Digital Library (NSDL) initiative. When complete, it will include information and resources for over 500 earth science and engineering topics, with links to curricular materials useful for teaching and learning about earth science, engineering, physics, and mathematics. E3 is also a valuable portal for anyone seeking up-to-date earthquake information and authoritative technical sources; it is a platform for cross-training scientists and engineers and will provide a basis for sustained communication and resource building between major education and outreach activities. Scientists, engineers, and educators who have suggestions for content can visit www.scec.org/e3 now to complete the "Suggest a Web Page" form.

E3 is now the primary SCEC framework for presenting extensive earthquake science and engineering information, including curricular materials and technical information organized by topical areas. E3 is used to organize materials for SCEC teacher workshops, field trips, exhibits, and other SCEC activities. A sophisticated information system for building and displaying the E3 collection and web pages has been developed, now called the SCEC Community Organized Resource Environment (SCEC/CORE). This content development and management system has now been used to create many other web and print resources, such as the main SCEC website and the new version of the Putting Down Roots in Earthquake Country brochure.

SCEC's Regional Seismicity and Geodesy Online Education Modules. These interactive online learning resources are based on seismic data from the SCEC data center and geodetic data from the Southern California Integrated GPS Network (SCIGN). The modules are used by high school and undergraduate students and teachers, and will be integrated with the Electronic Encyclopedia of Earthquakes (http://www.scecdc.scec.org/Module and http://scign.jpl.nasa.gov/learn). A new project is underway with Lisa Grant (UCI), Ralph Archuleta (UCSB), and Debi Kilb (Scripps), working with SCEC staff to update the functionality and content of several activities within the Seismicity module.

Seismic Sleuths Revision. SCEC is revising the AGU/FEMA Seismic Sleuths middle school earthquake curriculum to reflect advances in science and technology since the last update in 1995. The objectives are to promote and improve natural hazard education for students; to foster preparedness for natural hazards through empowerment and encouraging personal responsibility; to provide an updated and redesigned learning tool that can be easily integrated into a curriculum based on national standards; and to provide constant updates in science content, pedagogy, and resource information through an interactive website. Each unit has been streamlined and can stand alone in print or on the Internet, so it can be used in a variety of environments. In addition, a television special (Earthquakes: Seismic Sleuths) based on the series has been created and aired worldwide, made possible by funding from the California Department of Insurance, the Institute for Business and Home Safety, and SCEC. The hour-long video was first broadcast on “Assignment Discovery” in spring 2001. The video can be used by teachers as an excellent advance organizer, or viewed by interested citizens who want to learn more about earthquakes, the destruction they can cause, the scientists and engineers who study them, and what they can do to prepare. (http://school.discovery.com/lessonplans/programs/earthquakes-gettingready/q.html)


ShakeZone. In partnership with the Riverside County Children's Museum (“KidZone”), the CUREE-Caltech Woodframe Project and UC Riverside, SCEC created an educational, family-oriented exhibit on earthquakes ("ShakeZone") that opened in January 2002. The mission of the exhibit is to reach the local community, particularly the 20,000 elementary school children who visit KidZone each year, with positive messages about studying the Earth and preparing for earthquakes. The exhibit presents information about science, engineering, safety and mitigation. A shake table, an interactive computer display, and wall displays teach the visitors about the tools and techniques of earth scientists, engineers and emergency services personnel. (http://www.kidzone.org)

Teacher Workshops. SCEC offers teachers two to three full-day professional development workshops each year. The workshops provide a connection between developers of earthquake education resources and those who use these resources in the classroom. The workshops include content and pedagogical instruction, ties to national and state science education standards, and materials teachers can take back to their classrooms. Activities include the Dynamic Plate Puzzle, Seismic Waves with Slinkys, the Brick and Sandpaper Earthquake Machine, and a Shake Table Contest. At the end of the day teachers receive an assortment of free materials provided by IRIS, including posters, maps, books, slinkys, and binders containing all the lessons from the workshop.

In 2003 SCEC began a partnership with the SIO Visualization Center to develop teacher workshops. Facilities at the Visualization Center include a wall-sized curved panorama screen (over 10 m wide), which allows workshop participants to be literally immersed in the images being discussed. For example, when the traditional 2D maps of earthquake epicenter data were viewed in 3D, the teachers immediately understood that the faults depicted by the earthquake locations are 3D planes, not 2D lines. Two workshops have now been held with SIO, and a third is planned for summer 2005. (www.scec.org/education)

USC Science Education Collaborative. Since 2003, SCEC has greatly increased engagement with the inner-city neighborhoods around USC, forming partnerships to improve science education and increase earthquake awareness in the local community. One of these partnerships is with USC's Joint Education Project (JEP), which sends USC students into local schools to teach eight one-hour lessons pertaining to what the students are learning in their classes. SCEC now provides educational resources to JEP students in several earth-science courses, and trains the students how to use the resources in their lessons.

Another USC-area partnership is with the Education Consortium of Central Los Angeles (ECCLA), which funds three-week intersession programs in inner-city Los Angeles. SCEC revised and expanded ECCLA's existing earthquake curriculum, which was renamed “Earthquake Explorers.” SCEC also provided educational materials and arranged guest speakers and field trips.

SCEC has also partnered with JEP, USC Mission Science, USC Sea Grants and the Jet Propulsion Laboratory (JPL) to create hands-on workshops for teachers at schools in the neighborhoods surrounding USC.


National Association of Geoscience Teachers Far Western Section 2004 Annual Meeting. SCEC hosted this meeting with the USC Earth Science Department during the last weekend of February 2004. The teachers in attendance ranged from elementary school teachers through community college professors. A reception for the teachers opened the meeting on Friday evening, followed by talks given by Tom Henyey and Tom Jordan, past and present directors of SCEC. On Saturday, teachers chose one of three all-day field trips: Faults of Los Angeles, led by James Dolan; The Geology of the Palos Verdes Peninsula, led by Tom Henyey; and Oceanography and Coastal Geography, led by Steve Lund. The meeting banquet was held Saturday evening with Lucy Jones as keynote speaker. Dr. Jones spoke about earthquake prediction, followed by a question-and-answer session for the teachers. On Sunday the teachers chose between an all-day earthquake education workshop and one of three half-day field trips: the La Brea Tar Pits, the Southern California Integrated GPS Network, or the California Institute of Technology Seismology Lab.

Teaching Aids for University and College Level Classes: Visual Objects and QuickTime Movies [managed by Debi Kilb, UCSD/IGPP]. These teaching modules have been designed specifically to meet the needs of faculty members at SCEC-based institutions; they can be used in undergraduate and graduate classes and provide an introduction to 3D interactive exploration of data. At the 2003 SCEC meeting many of the visual objects were previewed and netted a favorable response (12 people asked for follow-up information). To date Kilb has discussed and/or ported products to 28 people from roughly 20 different institutions, and discussions to improve and augment these teaching tools are ongoing. Due to current space limitations, only some of the end products (e.g., QuickTime movies, interactive 3D data sets, image galleries) are currently accessible through a web-based digital library interface (http://www.siovizcenter.ucsd.edu/library.shtml) at the Visualization Center at Scripps Institution of Oceanography. There have been 550 unique visitors to these pages within the last 6 months. Plans are also in place to integrate many of the images and visual objects that we developed into the Electronic Encyclopedia of Earthquakes website (http://www.scec.org/e3/).

Teaching Aids for University and College Level Classes: Online Course development. This project is developing resources for undergraduate general education earthquake courses. Materials will include online PowerPoint files for lectures, portable demonstrations, and interactive online exercises for use in the classroom or by students at home. The online materials will be freely available to instructors at any school. The project may lead to the development of a consensus-based course that could allow interaction between students and faculty at separate institutions, with the ultimate goal of wide dissemination to the SCEC community, college and university teachers, and others.


Assessment

Education programs in SCEC2 have greatly expanded the Center’s ability to provide earthquake information and resources for students and teachers. By the beginning of SCEC3, these activities will be even further developed, especially the Electronic Encyclopedia of Earthquakes.

The SCEC2 intern programs have grown each year, and with the advent of the SCEC/USEIT program, SCEC2 has brought students into research and the earth sciences who had no previous interest in either. Of the 73 SCEC2 interns (SURE and USEIT combined), 29 were female, 7 were Hispanic, 1 was African American, and 1 was Pacific Islander. Of the 2004 interns, 7 were first-generation college students and 6 were from schools without research opportunities (2004 is the first year this information was tracked). In terms of attracting more students to the earth sciences, one student changed from an astrophysics major to a geology major, and two computer science undergraduates are now pursuing graduate degrees in geophysics. Through extensive recruitment activities in 2005 and beyond, we hope to continue to offer research opportunities to well-qualified and diverse students from around the country.

However, due to a focus on public outreach activities during the past few years (see next section), less time has been available to offer additional teacher workshops, develop as many curricular materials as originally planned, and establish partnerships with educational organizations on the same scale as our partnerships in other CEO focus areas. Building upon the resources developed in SCEC2, and expanding their geographic reach, must be a priority of the SCEC3 education effort.

Public Outreach Activities

This focus area involves activities and products for media reporters and writers, civic groups, and the general public, and has been a high priority during SCEC2. Much of 2003 was focused on planning activities and developing products for the 10-year anniversary of the Northridge earthquake in January 2004. These activities have continued into 2005 with product revisions and continued interactions with public outreach partners.

Objectives

The SCEC2 objectives for the Public Outreach Focus Area are to (1) provide useful general earthquake information, (2) develop information for the Spanish-speaking community, (3) facilitate effective media relations, and (4) promote SCEC activities.

Results

SCEC Webservice and SCEC News. SCEC's webservice presents the research of SCEC scientists; provides links to SCEC institutions, research facilities, and databases; and serves as a resource for earthquake information, educational products, and links to other earthquake organizations. In 2000 SCEC introduced SCEC News to provide a source of information on all matters relevant to the SCEC community, disseminating news, announcements, earthquake information, and in-depth coverage of earthquake research in a timely manner via the World Wide Web. Since its inception in March 2000, over 1500 people have subscribed to e-mailed news "bytes" announcing new articles. (www.scec.org)


Earthquake Country Alliance. To coordinate activities for the 10-year anniversary of the Northridge Earthquake in January 2004 (and beyond), SCEC led the development of the "Earthquake Country Alliance" (ECA). This group has been organized to present common messages, to share or promote existing resources, and to develop new activities and products. The ECA includes earthquake scientists and engineers, preparedness experts, response and recovery officials, news media representatives, community leaders, and education specialists. The ECA is now the primary SCEC framework for maintaining partnerships and developing new products and services for the general public.

The ECA first met in June 2003 to begin making plans for the Northridge earthquake anniversary. This planning resulted in a complementary set of activities (planned by the ECA or by individual organizations). The ECA will continue to coordinate public awareness efforts in southern California through these and additional products and activities over the next year and beyond. In 2006, the centennial anniversary of the 1906 San Francisco earthquake will be commemorated and the Alliance will participate in educational activities and events with partners in the Bay Area.

Putting Down Roots in Earthquake Country. In 1995 SCEC, USGS, and a large group of partners developed a 32-page color handbook on earthquake science, mitigation and preparedness. For the 10-year anniversary of the Northridge earthquake, a new version was produced by SCEC and the newly-formed ECA. The updated handbook features current understanding of when and where earthquakes will occur in Southern California, how the ground will shake as a result, and descriptions of what information will be available online. The preparedness section has been completely reworked and is now organized according to the “Seven Steps on the Road to Earthquake Safety.” These steps provide a simple set of guidelines for preparing and protecting people and property. 200,000 copies were printed in January 2004, with funding from the California Earthquake Authority (CEA) and FEMA, and another 150,000 copies were printed in September 2004, with funding from CEA, USGS, Edison, Amgen, Quakehold, and others. In Spring 2005 a further revision was printed (60,000 copies) with coupons for home mitigation products. Copies of the document have been distributed at home improvement centers (on tables with preparedness products), by the American Red Cross (at neighborhood safety trainings), and by many others. The updated handbook is now at www.earthquakecountry.info.

Putting Down Roots is the principal SCEC framework for providing earthquake science, mitigation, and preparedness information to the public. The “Roots” framework extends beyond the distribution of a printed brochure and the online version. For example, the Birch Aquarium in San Diego has a new earthquake exhibit featuring a “Seven Steps” display, and the Emergency Survival Program (managed by LA County) will base its 2006 campaign on the “Seven Steps.” In October 2004 over 15,000 copies were included in the Earth Science Week packets distributed to science teachers and others nationwide.

The new version of Putting Down Roots was designed to allow other regions to adopt its structure and create additional versions. The first will be a Northern California version currently in development. The document will also be produced in Spanish, and versions for other regions may be created.


Earthquake Country Alliance Website. SCEC hosts this new web portal (www.earthquakecountry.info), which provides answers to frequently asked questions and descriptions of other resources and services that ECA members provide. The portal uses technology developed for the E3 project (see above). Each ECA member can suggest links to their organization’s resources as answers to questions listed on the site. The structure closely parallels the new Putting Down Roots: sections include “What should I know?”, “Why should I care?”, “What should I do before?”, and “What should I do during and after?”

The site is set up separately from the main SCEC web pages (though it has attribution to SCEC) so that all members of the ECA see the site as their own and are willing to provide content. The site features the online version of Putting Down Roots and special information pages that all groups can promote, such as a page about the “10.5” miniseries and a page about the “Triangle of Life” theory (see assessments below).

Earthquake Country - Los Angeles. This video was produced by Dr. Pat Abbott of SDSU as the second in his “Written in Stone” series. The video tells the story of how the mountains and valleys of the Los Angeles area formed, including the important role of earthquakes. It features aerial photography, stunning computer animations, interviews with well-known experts, and 3D fault animations produced by SCEC’s “LA3D” visualization system. In addition to conducting several focus groups with teachers and preparedness experts in which the video was evaluated, SCEC is developing curricular kits for school and community groups to accompany the video. These kits will be duplicated in large quantities with funding from the California Earthquake Authority. The Los Angeles Unified School District has asked SCEC to train teachers how to use these curricular kits, and may include the video in a new sixth-grade Earth science curriculum soon to be adopted districtwide.

EqIP. CEO participates in the EqIP (Earthquake Information Providers) group, which connects information specialists from most earthquake-related organizations. EqIP's mission is to facilitate and improve access to earthquake information through collaboration, and to minimize duplication of effort by sharing information through individual personal contact, joint activities and projects, group annual meetings, biennial forums, and electronic communication. SCEC’s former CEO director was among the founding group members and managed the initial development of EqIP's website, which provides a database of descriptions of over 250 organizations with links to their websites. SCEC’s current director for CEO is now the Chair of this group. (www.eqnet.org)

Media Relations. SCEC engages local, regional, and national media organizations (print, radio, and television) to jointly educate and inform the public about earthquake-related issues. The goal has been to communicate clear, consistent messages to the public, both to educate and inform and to minimize misunderstandings or the perpetuation of myths. For example, in May 2005, CEO organized a major press briefing to announce the results of a study of losses expected from a range of earthquakes on the Puente Hills fault (www.scec.org/puentehills), which received broad regional, national, and international coverage. SCEC CEO encourages scientists who are interested in conducting interviews with media reporters and writers to take advantage of short courses designed and taught by public information professionals.


Wallace Creek Interpretive Trail. In partnership with the Bureau of Land Management (BLM), SCEC designed an interpretive trail along a particularly spectacular and accessible 2-km-long stretch of the San Andreas Fault near Wallace Creek. Wallace Creek is located on the Carrizo Plain, a 3-4 hour drive north from Los Angeles. The trail opened in January 2001. The area is replete with the classic landforms produced by strike-slip faults: shutter ridges, sag ponds, simple offset stream channels, mole tracks, and scarps. SCEC created the infrastructure and interpretive materials (durable signage, brochure content, and a website with additional information and directions to the trail). BLM has agreed to maintain the site and print the brochure for the foreseeable future. (www.scec.org/wallacecreek)

SCEC Publication Distribution. Copies of SCEC's field trip guides, technical reports (Phase I & II reprints, Liquefaction and Landslide Mitigation Guidelines reports, etc.), and the Putting Down Roots in Earthquake Country general-public handbook (see above) are widely distributed at workshops, at earthquake preparedness fairs, and through the SCEC website. (www.scec.org/resources/catalog)

Assessment

The public outreach products developed, updated, and maintained during SCEC2 represent a new capacity for providing earthquake-related information and services. Over the next few years and into SCEC3, these resources will allow SCEC and our partners to provide continually updated information through a broad assortment of venues and mechanisms. For example, because of the ECA, a coordinated response was possible during 2004 to several public awareness threats: a mini-series about a magnitude "10.5" earthquake, a widely reported prediction of a magnitude 6.5 earthquake in southern California, and a mass-email campaign promoting a (dangerous) alternative to the “drop, cover, and hold on” position that all preparedness groups endorse. ECA members were able to direct their audiences to a common webpage for information, rather than creating their own responses. The ECA e-mail list has provided a way for members to communicate with a larger group of their peers, and meetings have brought together existing partners and new allies.

During SCEC2 the news media have become increasingly aware of and interested in SCEC research, and now look to SCEC as an international source of information about earthquakes. After significant earthquakes and major earthquake-related news stories, reporters from around the world call SCEC for interviews. It is essential to carefully manage SCEC’s media presence, and we plan to continue to build awareness of SCEC as a media resource.

Knowledge Transfer Activities

There is a widely perceived gap between basic earthquake science and its implementation in risk mitigation (OTA, 1995). SCEC’s mission dictates that it work to close this implementation gap with engineers, emergency managers, public officials, and other users of earthquake science. The Knowledge Transfer focus area coordinates these activities.

Objectives

The SCEC2 objectives for the Knowledge Transfer focus area are to (1) engage in collaborations with earthquake engineering researchers and practitioners, (2) develop useful products and activities for practicing professionals, (3) support improved hazard and risk assessment by local government and private industry, and (4) promote effective mitigation techniques and seismic policies.

Results

Implementation Interface. A goal of SCEC2 has been to establish a closer working relationship with the earthquake engineering community that would be more effective in implementing physics-based hazard and risk analysis.


We therefore established a new working group, the SCEC Implementation Interface (P. Somerville, leader; R. Wesson, co-leader), as a funded component of the Center's program to promote these partnerships. It coordinates activities with all other SCEC working groups, particularly the Seismic Hazard Analysis focus group (N. Field, leader; D. Jackson, co-leader), which is responsible for developing earthquake forecasting models (with the ESP and Fault Systems groups) and intensity-measure relationships (with the Ground Motions group).

The objectives of the Implementation Interface are to (1) integrate physics-based seismic hazard analysis (SHA) developed by SCEC into earthquake engineering research and practice through two-way knowledge transfer and collaborative research; (2) provide a flexible computational framework for system-level hazard and risk analysis through the OpenSHA platform and the Community Modeling Environment; and (3) interface SCEC research with major initiatives in earthquake engineering, such as the Next Generation Attenuation project and the NSF-sponsored George E. Brown Network for Earthquake Engineering Simulation (NEES).

The first initiative was to set up a research partnership with the Pacific Earthquake Engineering Research (PEER) Center and its companion PEER-Lifelines Program. Several efforts were jointly funded by SCEC and PEER, including a large collaboration to study basin effects through wavefield modeling, led by S. Day (see Fig. 2.16), and a collaboration between A. Cornell and P. Somerville to develop vector-valued probabilistic seismic hazard analysis (VPSHA; Bazzurro and Cornell, 2002). The latter led to a novel application of VPSHA to the use of precariously balanced rocks in PSHA by Purvance et al. (2004) (see Fig. 2.20).

The partnership with PEER continues to develop through the Next Generation Attenuation (NGA) Project, a major collaboration involving SCEC, the PEER-Lifelines Program, and the USGS, which has been sponsored by the California Department of Transportation, the California Energy Commission, and PG&E. In its current phase, NGA-E (for empirical), SCEC scientists have used validated broadband ground motion simulation techniques to investigate features of attenuation models that are poorly constrained by currently available strong motion data, including rupture directivity effects, footwall vs. hanging-wall effects for dipping faults, depth-of-faulting effects (buried vs. surface rupture), static stress drop effects, and depth-to-basement and basin effects. SCEC work has involved the use of results from dynamic rupture models and foam experiments to shed light on the physics of rupture directivity and shallow/deep faulting effects on strong ground motion; the development of pseudodynamic models to facilitate the representation of the physics of these phenomena in earthquake source models; and kinematic ground-motion simulations of these effects using pseudodynamic source models to guide the development of functional forms of ground-motion models representing these effects. The new set of attenuation models produced by the NGA-E project will be finalized in spring 2005. These models will significantly change hazard estimates at short distances from seismic sources and how such estimates depend on magnitude.

The activities of the Implementation Interface were broadened through a workshop held in October 2003, which identified end-to-end simulation from the earthquake source through to structural response (“rupture-to-rafters”) as a key area for SCEC collaborations with the engineering community. This idea is the focus of a major SCEC3 initiative that will involve partnerships with PEER and CUREE (§III.C.5).
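The "functional forms" mentioned above are regression equations that predict a ground-motion measure from magnitude, distance, and source/site terms. As a hedged illustration of the general shape of such a model only — the form and every coefficient below are invented for this sketch and are not those of any NGA relation:

```python
import math

def ln_pga(mag, r_km, c0=-3.0, c1=0.6, c2=-1.3, h=6.0):
    """Hypothetical attenuation relation: ln(PGA) grows linearly with
    magnitude and decays with distance; h (km) caps near-field values.
    All coefficients are invented for illustration."""
    return c0 + c1 * mag + c2 * math.log(math.sqrt(r_km ** 2 + h ** 2))

# Median PGA in g for a M6.5 event at 10 km (toy numbers throughout)
median_pga = math.exp(ln_pga(6.5, 10.0))
```

Effects such as rupture directivity or hanging-wall proximity enter real models as additional terms of this kind; the NGA work described above uses simulations to help constrain those terms where recorded data are sparse.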


A collaboration between SCEC and the USGS has developed OpenSHA (Field et al., 2003), an open-source, object-oriented, web-enabled software system, integrated into the SCEC Community Modeling Environment, that provides a very flexible platform for seismic hazard analysis. OpenSHA allows investigators to easily perform strong motion simulations and seismic hazard analyses, accounting for multiple earthquake potential models and multiple approaches to ground motion prediction, including physics-based simulation approaches as well as conventional attenuation relation approaches. The OpenSHA group has participated in the formal PSHA-validation exercises sponsored by the PEER-Lifelines Program, and the software is gaining wide acceptance as the platform-of-choice for PSHA calculations.

Landslide Report and Workshops. In 1998, a group of geotechnical engineers and engineering geologists with academic, practicing, and regulatory backgrounds was assembled under SCEC auspices as a committee (chaired by Thomas Blake) to develop specific slope stability analysis implementation procedures to aid local southern California city and county agencies in their compliance with review requirements of the State’s Seismic Hazard Mapping Act. The work of that committee resulted in a relatively detailed set of procedures for analyzing and mitigating landslide hazards in California (edited by T. Blake, R. Hollingsworth, and J. Stewart), which SCEC published in 2002 and which is available on SCEC’s web site (www.scec.org/resources/catalog/hazardmitigation.html). In June 2002, over 200 geotechnical engineers, practicing geologists, government regulators, and others attended a two-day SCEC workshop that explained the Landslide document. Because of the outstanding response to the sold-out workshop, a second workshop was held in February 2003 for those who were unable to attend the first.
The course materials (now available for order) include extensive printed materials including all PowerPoint presentations, and two CDs with software tools and PDF files of all presentations and printed materials. As a bonus, the CD includes PDF files of the presentations given at the 1999 SCEC Liquefaction workshop and both the Landslide and Liquefaction Procedures documents. Plans are now being discussed to offer these workshops in Northern California in 2005.
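Returning to the PSHA platform described earlier in this section: the core calculation that OpenSHA makes pluggable combines, for every earthquake source, an annual occurrence rate with the probability that the resulting ground motion exceeds a given level. The sketch below is a minimal illustration of that logic only — it is not OpenSHA's actual (Java) API, and the lognormal ground-motion model, source rates, and median values are all invented:

```python
import math

def prob_exceed(ln_level, ln_median, sigma=0.6):
    """P(ln GM > ln_level) under a lognormal ground-motion model."""
    z = (ln_level - ln_median) / sigma
    return 0.5 * math.erfc(z / math.sqrt(2.0))

def annual_rate_of_exceedance(sources, pga_g):
    """Toy hazard sum: each source's annual rate times its probability
    of producing ground motion above pga_g at the site."""
    return sum(rate * prob_exceed(math.log(pga_g), ln_med)
               for rate, ln_med in sources)

# Two invented sources: (annual rate, ln of median PGA in g at the site)
sources = [(0.01, math.log(0.30)), (0.10, math.log(0.05))]

# A toy hazard curve: annual exceedance rate at three PGA levels
hazard_curve = {g: annual_rate_of_exceedance(sources, g)
                for g in (0.1, 0.2, 0.4)}
```

A real platform generalizes each piece: the source list becomes an earthquake rupture forecast, and prob_exceed becomes an interchangeable attenuation relation. That interchangeability is the flexibility credited to OpenSHA above.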

HAZUS Activities. SCEC is coordinating the development and activities of the Southern California HAZUS Users Group (SoCalHUG) with the Federal Emergency Management Agency (FEMA) and the California Office of Emergency Services (OES). HAZUS (www.hazus.org) is FEMA's earthquake loss estimation software program. SoCalHUG brings together current and potential HAZUS users from industry, government, universities, and other organizations to (a) train GIS professionals in HAZUS earthquake loss estimation software, (b) improve earthquake databases and inventories, and (c) develop and exercise emergency management protocols. SCEC is considering how it can improve the data and models that HAZUS uses in its calculations, and sees this community as an important audience for SCEC research results. SCEC CEO has organized four general meetings of the user group and several HAZUS trainings. The most recent was held in June 2005 at SCEC headquarters at USC, with ten participants trained to be HAZUS “vendors” in the region.

The training was preceded by a User Group meeting, and will be repeated in fall 2005.

EERI Southern California Chapter. Since 2003, SCEC has hosted the bi-monthly meetings of the southern California chapter of the Earthquake Engineering Research Institute. These meetings include a speaker on a particular topic of interest to the attendees, typically civil, structural, and geotechnical practicing engineers. For example, on November 19, 2003, over 40 people attended a meeting with a speaker addressing new research on “Assessment and Repair of Earthquake Damage in Woodframe Construction,” and on January 19, 2005, 20 EERI members attended a briefing on the recent Sumatran earthquake and Indian Ocean tsunami.

Assessment

Much of the SCEC2 knowledge transfer effort to date has been focused on developing partnerships with research and practicing engineers, and on educating the users of technical products. By the end of SCEC2, new resources such as OpenSHA and the SCEC Community Modeling Environment will greatly expand the services SCEC can provide. SCEC partnerships with earthquake engineering organizations are now very strong, and we expect they will continue to produce significant results through joint research projects. These results may lead to safer buildings through improved modeling of ground motions and improved engineering design to accommodate those motions. However, such improvements will only be implemented if building codes are updated and local governments regulate construction accordingly. To truly achieve its mission of reducing earthquake risk, SCEC must work at all levels of implementation, from basic research to enforcement of building codes at the local level.

To identify how to strengthen risk communication between SCEC and local governments, L. Grant and E. Runnerstrom of UC Irvine were supported by CEO to study the utilization of seismic hazard data and research products by cities in Orange County, CA. The study focused on evaluating the effectiveness of previous SCEC activities and products in communicating seismic risk at the municipal level. Orange County is well suited for this study because it contains diverse sociologic, geologic, and seismic conditions. In particular, the study looked at the direct use of SCEC products by local-level policy-makers and staff. By understanding the variation in the use of SCEC products, effective areas or targets within cities for risk communication should emerge. Preliminary analyses of the data suggest that SCEC products are underutilized for local planning and seismic hazard mitigation. This may be partly because of nested references within other resources that are non-exclusive to SCEC, and other use of SCEC products without direct citation. The study focused on Safety Elements and related documents (including Technical Background Reports) for Orange County’s 34 cities and found that nearly all cities in Orange County relied on planning and/or geotechnical firms to prepare technical reports or Safety Elements. Therefore, these consultants would be excellent targets for seismic risk and hazard communication by SCEC.

SCEC Community Development

The foundation of SCEC CEO is our partnerships and participation in many communities in each of the previous focus areas. Supporting the SCEC community from within is a parallel activity that bolsters our ability to reach out effectively to others. This focus area includes activities and resources relevant to SCEC scientists and students.

Objectives

The SCEC2 objectives for the SCEC Community Development focus area are to (1) increase the diversity of SCEC leadership, scientists, and students, (2) facilitate communication within the SCEC Community, and (3) increase utilization of products from individual research projects.

Results

SCEC Diversity Issues and Possible Activities for a Diversity Task Force. SCEC is committed to supporting the participation of a diverse community of scientists, students, staff, and others. At the beginning of SCEC2, a Diversity Task Force of the Board of Directors was established to identify policies for increasing diversity. This Task Force began by identifying several issues:

The leadership of SCEC, including the Officers and the Board, is predominantly white and male.

The Planning Committee has significant power in SCEC2 and serves as a stepping-stone to leadership. It would be desirable for the planning committee to be significantly diverse.


Although many women and minority students are involved in intern and other programs at the undergraduate level, successively smaller numbers of women and minorities are involved at the graduate student, post doctoral, junior faculty and senior faculty levels.

SCEC is a consortium of institutions and as an organization has very little control in hiring scientists and staff, and in admitting students. Diversity goals can be encouraged but not mandated.

The current situation is not unique to SCEC, but reflects historical trends in the earth and physical science communities.

Several activities to address these issues have been identified, including improved demographic assessments of SCEC participants (for a baseline understanding of diversity in SCEC), establishing goals for increasing the numbers of women and under-represented minorities at all levels of SCEC leadership (Board, Planning Committee, etc.), and establishing policy guidelines for the selection of individuals for "stepping stone" opportunities, including speaking at SCEC meetings and membership on SCEC committees. These activities have been implemented. For 12 years, the SCEC intern program has given research opportunities to students with diversity as a goal, and long-term tracking shows that many of the under-represented students who participated are still in science careers.

Of the 580 participants throughout SCEC2 (some of whom are no longer involved), diversity at various levels seems to reflect historical trends, with much greater diversity among students than senior faculty. In terms of gender, women account for 42% of SCEC undergraduates, 36% of graduate students, 27% of non-faculty researchers, 42% of administrative staff, and 15% of faculty researchers. SCEC has increased the representation of women on its Board of Directors (2 of 15), though board members are appointed by institutions rather than selected by SCEC leadership. Three women now participate in the SCEC Planning Committee, and SCEC hopes to continue to identify women within each working group willing to take on leadership roles.

Participation of under-represented minorities in SCEC also reflects general Earth science levels, and is much lower than preferred at this time. Overall, of the 580 SCEC2 participants, 25 are Latino, 10 are Native American, 3 are Black, 2 are Pacific Islander, 105 are Asian, 413 are white, and 32 are unknown.

Other plans that have been discussed include the establishment of a “sounding board” (a committee of SCEC participants who could serve as informal counselors), holding an evening session at the annual meeting where diversity issues could be aired, developing a mentoring program at a variety of scales (especially at the graduate student, post doc, and junior faculty levels), and identifying successful diversity practices of other large science organizations. These and other activities are being considered to continue to support the career trajectories of all members (and potential members) of the SCEC community.

SCEC Community Information System (SCEC/CIS). SCEC has developed a new online database system, using technology developed as part of the Electronic Encyclopedia of Earthquakes project. This system was first implemented to facilitate registration for the 2002 SCEC Annual Meeting, and has since been used for registration for most SCEC workshops and meetings, for tracking SCEC publications, for submitting and reviewing SCEC proposals each year, maintaining demographic information, managing e-mail lists, and for providing access to contact information for each of the 750+ members of the SCEC Community. This system also allows SCEC CEO to better track research projects with potential CEO applications.

As a service for other communities associated with SCEC, similar interfaces have been developed using the same system. Such communities include the California Post-Earthquake Information Clearinghouse, the Earthquake Country Alliance, the Earthquake Information Providers (EqIP), and soon others. Members of multiple communities need only remember a single password and update their information in one location to keep it current for all communities.

Assessment


This is a new area of organized attention in SCEC2, and the structures and mechanisms for achieving the objectives listed above are still in development. Still, SCEC has already made progress in increasing diversity in the community, such as improved representation of women in SCEC leadership positions. The issue of diversity in the sciences extends far beyond SCEC. However, because SCEC is a large community with significant representation at the nation’s leading research institutions, it has an opportunity to make a difference.

The objective of increasing the utilization of products from individual research projects (as opposed to products developed from overall SCEC system-level results) has not yet been sufficiently addressed. One new mechanism for promoting awareness of these projects is “SCEC Nuggets,” 1-2 page summaries of basic research results, requested for the first time in late 2004. These summaries will also allow SCEC CEO to better identify research projects with potential educational or technical products.

CEO Management Activities

Recruit CEO Advisory Panel. To expand participation by partners and recipients of SCEC CEO activities, a small advisory panel will be recruited to help review progress and provide suggestions for opportunities that might otherwise be unknown.

Develop strategic plan. Continue development of long-term strategic plan, with a focus on evaluation strategies. The CEO advisory panel will be instrumental in providing guidance for evaluation priorities. Careful assessment must be conducted at every stage of program development in order to ensure that the program can be responsive to audience needs and effective in achieving its goals:

1) Stakeholder needs assessment will determine a base level of knowledge among various audiences and identify specific needs to be addressed. This information will be gathered through document reviews and interviews with representatives of the key target audience groups.

2) Evaluation design will consider the types of evaluation methodologies and logic models SCEC CEO will employ, based on decisions about what should be evaluated (quality and/or quantity of products? usefulness of services? cost-effectiveness?) and why the evaluation is needed (to improve the discipline of E&O? for accountability to agency management and stakeholders? to improve service delivery and program effectiveness?).

3) Performance measurement of product development and implementation will involve collecting accountability information for stakeholders, tracking intended and unintended outcomes of the program, and providing information vital to program improvement in order to achieve pre-established goals. This information can be useful for management of activities, resources, and partnerships.

4) Programmatic assessment of the overall success in achieving SCEC’s stated goals, with identification of what was successful, what failed, and why. This step is broader than performance measurement, as it addresses the long-term, overall effect of the CEO program as a whole, and it has implications for other large-scale E&O programs.

Represent SCEC as Member of:
- Network for Earthquake Engineering Simulation (NEES) E&O Committee
- Earthquake Information Providers (EqIP) group (Benthien is Chair)
- Earthquakes and Mega Cities Initiative (Los Angeles representative)
- Western States Seismic Policy Council
- California Post-Earthquake Technical Information Clearinghouse (Benthien is chair of Information Technology workgroup)
- Emergency Survival Program Coordinating Council
- Southern California HAZUS Users Group (Benthien is project lead)
- EERI Southern California Chapter (SCEC hosts bimonthly meetings)
- EERI Mitigation Center So. Cal. Planning Committee
- City of Los Angeles Local Hazard Mitigation Grant Advisory Committee


- County of Los Angeles Local Hazard Mitigation Grant Advisory Committee

Document and Report on CEO activities. Each year many presentations and reports are prepared to describe the activities of the CEO program. In 2003 a paper was published in a special issue of Seismological Research Letters focused on education and outreach.


DRAFT 2006 Program Announcement
FOR THE SOUTHERN CALIFORNIA EARTHQUAKE CENTER

I. INTRODUCTION

On February 1, 2002, the Southern California Earthquake Center (SCEC) changed from an entity within the NSF/STC program to a free-standing center, funded by NSF/EAR and the U. S. Geological Survey. This document solicits proposals from individuals and groups to participate in the fourth year of the program.

II. GUIDELINES FOR PROPOSAL SUBMISSION

A. Due Date: November 11, 2005, 5:00 pm PST. Late proposals will not be accepted.

B. Delivery Instructions. Proposals and annual reports must be submitted as separate PDF documents via the SCEC Proposal web site at http://www.scec.org/proposals. Submission procedures, including requirements for how to name your PDF files, will be found at this web site. Please note the separate instructions for submitting science nuggets.

C. Formatting Instructions.

Cover Page: Should begin with the words “2006 SCEC Proposal,” the project title, Principal Investigator, institution, proposal categories (from the types listed in Section IV, including the new SCEC Intern Support category), and the disciplinary committee(s) and focus group(s) that should consider your proposal. Indicate if the proposal should also be identified with one or more of the SCEC special projects (see Section VII) or advanced Implementation Interface projects (see Section VIII for examples). Collaborative proposals involving multiple investigators and/or institutions should list all principal investigators. Proposals do not need to be formally signed by institutional representatives, and should be for one year, with a start date of February 1, 2006.

Technical Description: Describe in five pages or fewer (including figures) the technical details of the project and how it relates to the short-term objectives outlined in the SCEC Science Plan (Section VII).

Budget Page: Budgets and budget explanations should be constructed using NSF categories. Under guidelines of the SCEC Cooperative Agreements and A-21 regulations, secretarial support and office supplies are not allowable as direct expenses. Budgeted matching funds for SCEC interns will only be awarded if a PI for the project is paired with a student intern.

Current Support: Statements of current support, following NSF guidelines, should be included for each Principal Investigator.

2005 Annual Report: Scientists funded by SCEC in 2005 must submit a report of their progress with their 2006 proposals. 2006 proposals lacking 2005 reports (which may cover 2004 to mid-year 2005 results) will neither be reviewed nor considered for 2006 funding. Reports should be up to five pages of text and figures.

D. Investigator Responsibilities. Investigators are expected to interact with other SCEC scientists on a regular basis (e.g., by attending workshops and working group meetings), and to contribute data, analysis results, and/or models to the appropriate SCEC data center (e.g., Southern California Earthquake Data Center—SCEDC), database (e.g., Fault Activity Database—FAD), or community model (e.g., Community Velocity Model—CVM). Publications resulting entirely or partially from SCEC funding must include a publication number, available at http://www.scec.org/research/scecnumber/index.html. By submitting a proposal, investigators are agreeing to these conditions.

E. Eligibility. Proposals can be submitted by eligible Principal Investigators from:
- U.S. academic institutions
- U.S. private corporations
- International institutions (funding will mainly be for travel)

F. Collaboration. Collaborative proposals with investigators from the USGS are encouraged. USGS employees should submit their requests for support through USGS channels. Collaborative proposals involving multiple investigators and/or institutions are strongly encouraged; these can be submitted with the same text, but with different institutional budgets if more than one institution is involved.

G. Award Procedures. All awards will be funded by subcontract from the University of Southern California. The Southern California Earthquake Center is funded by the National Science Foundation and the U. S. Geological Survey.

III. SCEC ORGANIZATION

A. Mission and Science Goal. SCEC is an interdisciplinary, regionally focused organization with a mission to:
- Gather new information about earthquakes in Southern California;
- Integrate this information into a comprehensive and predictive understanding of earthquake phenomena; and
- Communicate this understanding to end-users and the general public in order to increase earthquake awareness, reduce economic losses, and save lives.

SCEC’s primary science goal is to develop a comprehensive, physics-based understanding of earthquake phenomena in Southern California through integrative, multidisciplinary studies of plate-boundary tectonics, active fault systems, fault-zone processes, dynamics of fault ruptures, ground motions, and seismic hazard analysis. The long-term science goals are summarized in Appendix A.

B. Disciplinary Activities. The Center sustains disciplinary science through standing committees in seismology, geodesy, geology, and fault and rock mechanics. These committees will be responsible for planning and coordinating disciplinary activities relevant to the SCEC science plan, and they will make recommendations to the SCEC Planning Committee regarding support of disciplinary research and infrastructure. High-priority disciplinary activities are summarized in Section VII.A.

C. Interdisciplinary Focus Areas. Interdisciplinary research is organized into five science focus areas: 1) unified structural representation, 2) fault systems, 3) earthquake source physics, 4) ground motion, and 5) seismic hazard analysis. In addition, interdisciplinary research in risk assessment and mitigation will be the subject for collaborative activities between SCEC scientists and partners from other communities including earthquake engineering, risk analysis, and emergency management. High-priority activities are listed for each of these interdisciplinary focus areas in Section VII.B.


D. Special Projects. SCEC encourages and supports several special projects, including the Southern California Continental Borderland initiative and the development of an advanced IT infrastructure for system-level earthquake science in Southern California. High-priority activities for these special projects are listed in Section VII.C.

E. Communication, Education, and Outreach. SCEC maintains a strong Communication, Education, and Outreach (CEO) program with four principal goals: 1) coordinate productive interactions among SCEC scientists and with partners in science, engineering, risk management, government, business, and education; 2) increase earthquake knowledge and science literacy at all educational levels; 3) improve earthquake hazard and risk assessments; 4) promote earthquake preparedness, mitigation, and planning for response and recovery. Opportunities for participating in the CEO program are described in Section VIII. Current activities are described online at http://www.scec.org/ceo.

IV. PROPOSAL CATEGORIES

A. Data Gathering and Products. SCEC coordinates an interdisciplinary and multi-institutional study of earthquakes in Southern California, which requires data and derived products pertinent to the region. Proposals in this category should address the collection, archiving and distribution of data, including the production of SCEC community models that are on-line, maintained, and documented resources for making data and data products available to the scientific community.

B. Integration and Theory. SCEC supports and coordinates interpretive and theoretical investigations on earthquake problems related to the Center’s mission. Proposals in this category should be for the integration of data or data products from Category A, or for general or theoretical studies. Proposals in Categories A and B should address one or more of the goals in Section VII, and may include a brief description (<200 words) as to how the proposed research and/or its results might be used in an educational or outreach mode (see Section VII).

C. Workshops. SCEC participants who wish to host a workshop between February 2006 and February 2007 should submit a proposal for the workshop in response to this RFP. Workshops on the following topics are particularly relevant:
- Organizing collaborative research efforts for the five-year SCEC program (2002-2007). In particular, interactive workshops that engage more than one focus and/or disciplinary group are strongly encouraged.
- Engaging earthquake engineers and other partner and user groups in SCEC-sponsored research.
- Participating in national initiatives such as EarthScope, the Advanced National Seismic System (ANSS), and the George E. Brown, Jr. Network for Earthquake Engineering Simulation (NEES).

D. Communication, Education, and Outreach. SCEC has developed a long-range CEO plan, and opportunities for participation are listed in Section VIII. Investigators who are interested in participating in this program should contact Mark Benthien (213-740-0323; [email protected]) before submitting a proposal.

E. SCEC Intern Support. Each year SCEC coordinates the SCEC Summer Undergraduate Research Experience (SCEC/SURE) program to support undergraduate student research with SCEC scientists. See the SCEC Internship website at <http://www.scec.org/internships> for more information. Proposals in categories A, B, and D are encouraged to specify a project for a student for Summer 2006, and to provide at least $2,500 of the $5,000 student stipend. (The remainder of the stipend will be matched by NSF REU Supplement support.) The project description should include a one-paragraph statement of the scientific problem, research location, intern responsibilities, and necessary skills and educational preparation. Proposals selected for SCEC funding that have specified intern projects will be announced on the SCEC Internship web page (using the one-paragraph statement) to allow applicants to rank their preferred projects. If a student is not selected for a project, the funding allocated for the student will be removed before project funds are transferred to the PI.

V. EVALUATION PROCESS AND CRITERIA

Proposals should be responsive to the RFP. A primary consideration in evaluating proposals will be how directly the proposal addresses the main objectives of SCEC. Important criteria include (not necessarily in order of priority):

- Scientific merit of the proposed research
- Competence and performance of the investigators, especially in regard to past SCEC-sponsored research
- Priority of the proposed project for short-term SCEC objectives as stated in the RFP
- Promise of the proposed project for contributing to long-term SCEC goals as reflected in the SCEC science plan (see Appendix A)
- Commitment of the P.I. and institution to the SCEC mission
- Value of the proposed research relative to its cost
- Ability to leverage the cost of the proposed research through other funding sources
- Involvement of students and junior investigators
- Involvement of women and underrepresented groups
- Innovative or "risky" ideas that have a reasonable chance of leading to new insights or advances in earthquake physics and/or seismic hazard analysis

Proposals may be strengthened by describing:
- Collaboration
  - Within a disciplinary or focus group
  - Between disciplinary and/or focus groups
  - In modeling and/or data gathering activities
  - With engineers, government agencies, and others (see Section VIII, Advanced Implementation Interface)
- Leveraging additional resources
  - From other agencies
  - From your institution
  - By expanding collaborations
- Development and delivery of products
  - Community research tools, models, and databases
  - Collaborative research reports
  - Papers in research journals
  - End-user tools and products
  - Workshop proceedings and CDs
  - Fact sheets, maps, posters, public awareness brochures, etc.
  - Educational curricula, resources, tools, etc.
- Educational opportunities
  - Graduate student research assistantships
  - Undergraduate summer and year-round internships (funded by the project)
  - K-12 educator and student activities
  - Presentations to schools near research locations
  - Participation in data collection

All research proposals will be evaluated by the appropriate disciplinary committees and focus groups, the Science Planning Committee, and the Center Director. CEO proposals will be evaluated by the CEO Planning Committee and the Center Director.

The Science Planning Committee is chaired by the Deputy Director and comprises the chairs of the disciplinary committees, focus groups, and special projects. It is responsible for recommending a balanced science budget to the Center Director.

The CEO Planning Committee is chaired by the Associate Director for CEO and comprises experts involved in SCEC and USGS implementation, education, and outreach. It is responsible for recommending a balanced CEO budget to the Center Director.

Recommendations of the planning committees will be combined into an annual spending plan by the Executive Committee of the SCEC Board of Directors and forwarded to the Board of Directors for approval.

Final selection of research projects will be made by the Center Director, in consultation with the Board of Directors.

The review process should be completed and applicants notified by the end of February, 2006.

VI. COORDINATION OF RESEARCH BETWEEN SCEC AND USGS-ERHP

Earthquake research in Southern California is supported both by SCEC and by the USGS Earthquake Hazards Reduction Program (EHRP). EHRP's mission is to provide the scientific information and knowledge necessary to reduce deaths, injuries, and economic losses from earthquakes. Products of this program include timely notifications of earthquake location, size, and potential damage; regional and national assessments of earthquake hazards; and increased understanding of the causes of earthquakes and their effects. EHRP funds research via its External Research Program, as well as work by USGS staff in its Pasadena, Menlo Park, and Golden offices. The EHRP also supports SCEC directly with $1.1M per year.

SCEC and EHRP coordinate research activities through formal means, including USGS membership on the SCEC Board of Directors and a Joint Planning Committee, and through a variety of less formal means. Interested researchers are invited to contact Dr. Lucy Jones, EHRP coordinator for Southern California, or other SCEC and EHRP staff to discuss opportunities for coordinated research.

The USGS EHRP supports a competitive, peer-reviewed, external program of research grants that enlists the talents and expertise of the academic community, State and local government, and the private sector. The investigations and activities supported through the external program are coordinated with and complement the internal USGS program efforts. This program is divided into six geographical/topical 'regions', including one specifically aimed at southern California earthquake research and others aimed at earthquake physics and effects and at probabilistic seismic hazard assessment (PSHA). The Program invites proposals that assist in achieving EHRP goals.


The EHRP web page, http://erp-web.er.usgs.gov/, describes program priorities, projects currently funded, results from past work, and instructions for submitting proposals. The EHRP external funding cycle is offset several months from SCEC's, with the RFP due out in February and proposals due in early May. Interested PIs are encouraged to contact the USGS regional or topical coordinators for Southern California, Earthquake Physics and Effects, and/or National (PSHA) research, as listed under the "Contact Us" tab.

USGS internal earthquake research is summarized by topic at http://earthquake.usgs.gov/scitech/research/ and by project at http://earthquake.usgs.gov/research/program/. Projects of particular relevance to SCEC are described under the following titles:

- Southern California Earthquake Project
- FOCUS on Quaternary Stratigraphy in the Los Angeles Region
- National Seismic Hazard Maps
- Earthquake Probabilities And Occurrence
- The Physics of Earthquakes
- Earthquake Effects
- Deformation
- U.S. National Strong Motion Program
- Earthquake Information
- Seismograph Networks

VII. RESEARCH OBJECTIVES

The research objectives outlined below are priorities for immediate research. They carry the expectation of substantial and measurable success during the coming year. In this context, success includes progress in building or maintaining a sustained effort to reach a long-term goal. How proposed projects address these priorities will be a major consideration in proposal evaluation, and they will set the programmatic milestones for the Center’s internal assessments. In addition to the priorities outlined below, the Center will also entertain innovative and/or "risky" ideas that may lead to new insights or major advancements in earthquake physics and/or seismic hazard analysis.

A. Disciplinary Activities

The Center will sustain disciplinary science through standing committees in seismology, geodesy, geology, and fault and rock mechanics. These committees will be responsible for planning and coordinating disciplinary activities relevant to the SCEC science plan, and they will make recommendations to the SCEC Planning Committee regarding the support of disciplinary infrastructure. High-priority disciplinary objectives include the following tasks:

1. Seismology

Data Gathering: Maintain and improve the ability of SCEC scientists to collect seismograms to further the goals of SCEC. Efforts may include: 1) Maintaining and adding to the network of borehole seismometers in order to improve resolution of earthquake source physics and the influence of the near-surface on ground motions, and 2) maintaining and upgrading a pool of portable instruments in support of targeted deployments or aftershock response.


Other possible activities include seed money for design of future experiments such as dense array measurements of basin structure and large earthquake properties, OBS deployments, and deep basement borehole studies.

Data Products: Improve the ability of users to retrieve seismograms and other seismic data and enhance the usefulness of data products, such as catalogs of earthquake parameters, arrival time and polarity information, and signal-to-noise measures. An important SCEC resource is the Southern California Earthquake Data Center (SCEDC), whose continued operation is essential to deciphering Southern California earthquakes and fault structure. Enhancements to the SCEDC are encouraged that will extend its capabilities beyond routine network operations and waveform archiving, and assist researchers in using more of the data. Desirable improvements include support hardware and software enhancements, better integration with data centers in other regions, and extension of catalogs. Specific goals include: 1) Developing the ability to preview seismograms and directly load waveforms into programs, 2) Implementing software that permits access to both northern and southern California data with a single data request, and 3) Incorporating first motion and moment tensors as they become available.

2. Tectonic Geodesy

Data Gathering: Support the collection of geodetic data that will improve knowledge of crustal motion, particularly in the vertical, in areas of special interest; the proposal should explain how this improvement is likely to occur, and how the proposed measurements relate to others, both existing (the CMM and SCIGN) and planned (PBO). Measurements may include reobservations to lower errors, reobservations at sites observed only once before, or new sites. Measurements may be done with any relevant geodetic technique. Observations that will help to clarify vertical motions are especially valued. Provide support to assist in the collection of other data relevant to time-varying deformation. Provide support to assist in the operation of, and data distribution from, the WInSAR Archive.

Data Products: Continue to assimilate newly acquired GPS data into new versions of the Crustal Motion Map, to provide better descriptions of the postseismic and coseismic motions from earthquakes, estimates of vertical motion, and a description of motions along active areas throughout California. This effort should work towards the combination of survey-mode and continuous GPS data into a seamless set of products. Support small-scale projects which use InSAR data, solely or combined with other measurements, to produce products for general use or for targeted study of special areas. Provide support to help ensure that data from all continuous GPS sites in Southern California are archived in ways that facilitate easy access, and that a unified set of time-series products are available for all such sites.

3. Earthquake Geology


Data Gathering: Plan, coordinate, and provide infrastructure for onshore and offshore geologic fieldwork, including chronologic support and shared equipment; formulate field tests of paleoseismic methodology; collect new information on fault slip rates, paleoseismic chronologies that span multiple recurrence cycles, slip in past earthquakes, and other geologic measurements of active tectonics that help resolve the current discrepancies between long-term geologic rates and GPS measurements and further our understanding of earthquake recurrence processes; coordinate fault geology studies with LiDAR data collected for the southern San Andreas fault and San Jacinto faults; develop, build and contribute new and existing data to the Working Group on California Earthquake Probabilities (WGCEP); develop methodology to test and improve resolution of event chronologies and correlations; foster subsurface analysis of fault systems, including the 3D configuration of emergent and blind thrusts and the role of off-fault deformation; compile and generate data on vertical motions to compare to geodetic (including InSAR) results. Compile existing information and conduct detailed studies of fault zone materials and structures in and adjacent to exhumed faults in order to understand deformation processes and conditions and their implications for the nucleation and propagation of earthquake ruptures, including fault zone signatures of rupture direction. Proposals should focus on studies that can be completed in the timeframe of SCEC 2, and that will yield tangible data products that contribute to our understanding of the fault system.

Data Products: Integrate field and laboratory efforts to date geologic samples and events, including standardized procedures for field documentation, sample treatment, dating methodologies, and data archiving and distribution; produce long-term rupture histories for selected fault systems in Southern California, with specific interest in the Los Angeles, Mojave, and southern San Andreas systems; address the GPS/geology discrepancy for some faults; construct a community vertical motions map (10^5-year timescale).

4. Fault and Rock Mechanics

Data Gathering: Areas of FARM research include fault modeling, laboratory studies, field studies of exhumed faults, and studies of faults from drill cores. While all areas of FARM research in support of the interdisciplinary working groups will be considered, greatest emphasis will be given to research that can increase our understanding of fault behavior during dynamic earthquake slip and thereby provide useful input for models of dynamic rupture propagation. In particular, emphasis will be given to: 1) pilot studies designed to develop and test new techniques, or to develop a new facility, to measure sliding resistance of faults at seismic slip rates, 2) detailed characterizations of natural slip surfaces and the products of high-speed deformation experiments to identify the structures diagnostic of dynamic slip and to test hypotheses of dynamic weakening, 3) modeling activities to predict fault behavior during dynamic slip, especially with extreme weakening, 4) field studies geared towards developing a model of the 3D structure of a fault zone, particularly to define and quantify geometric and material property variations that influence rupture propagation, 5) laboratory, modeling, and field studies aimed at determining the maximum possible ground shaking that can result from plausible extreme behavior at the earthquake source, 6) field studies of exhumed faults aimed at detecting possible thermal signatures of paleo-earthquake slip using a variety of methods such as vitrinite reflectance and geochronometers sensitive to short heat pulses, 7) developing a database for large strike-slip faults world-wide that could serve as analogs to the seismogenic depth range of the modern San Andreas fault in Southern California and that includes information about tectonic setting and history, depth of exhumation, locations, quality and extent of exposures, and an annotated bibliography for each fault including any relevant fault and rock mechanics research, 8) modeling fault behavior on the San Andreas near the EarthScope SAFOD site and collaborative studies of the structure and properties of material recovered during SAFOD drilling, and 9) cataloging of and studies of existing industry core material crossing significant faults in Southern California in order to address fault zone process questions. Also of importance, but of lower priority, is to conduct coordinated field, laboratory and theoretical studies to determine the time evolution of physical parameters during the inter-seismic period that might control the onset and characteristics of earthquake faulting. Such parameters might include those controlling fault/fluid interactions and frictional properties.

Data Products: Assess information and products from rock-mechanics experiments and fieldwork that will be most useful in SCEC studies of earthquake source physics and fault-system dynamics; develop an IT framework for an open database of experimental, model, and field results and expand upon existing databases.

B. Interdisciplinary Focus Areas

Interdisciplinary research will be organized into five science focus areas: 1) structural representation, 2) fault systems, 3) earthquake source physics, 4) ground motion, and 5) seismic hazard analysis. In addition, interdisciplinary research in risk assessment and mitigation will be the subject for collaborative activities between SCEC scientists and partners from other communities – earthquake engineering, risk analysis, and emergency management. This partnership will be managed through: 6) an implementation interface, designed to foster two-way communication and knowledge transfer between the different communities. SCEC will also sponsor a partnership in: 7) information technology, with the goal of developing an advanced IT infrastructure for system-level earthquake science in Southern California. High-priority objectives are listed for each of the five interdisciplinary focus areas below. Collaboration within and across focus areas is strongly encouraged.

1. Structural Representation

Community velocity model (CVM): Develop and implement improvements to the current SCEC velocity models, with emphasis on more accurate representations of Vs and density structure, basin shapes, and attenuation. Make the models compatible with fault positions and displacements as represented in the CFM. Evaluate the models with data (e.g., waveforms, gravity) to distinguish alternative models and quantify model uncertainties.

Community fault model (CFM): Improve and evaluate the CFM, placing emphasis on: a) defining the geometry of major faults that are incompletely or inaccurately represented in the current model; b) producing alternative fault representations; and c) providing more detailed representations of fault terminations and linkages. Evaluate the CFM with data (e.g., seismicity) to distinguish alternative fault models.


Unified structural representation (USR): Develop a flexible delivery system for the USR and its model components. Generate volumetric meshes of the Community Block Model (CBM), and a database of USR components, including faulted horizons (as strain markers and property boundaries) that are compatible with the CBM, CVM, and CFM.

2. Fault Systems

Fault-System Behavior: Investigate the system-level architecture and behavior of fault networks to better understand the cooperative interactions that take place over a wide range of scales, assessing the ways in which the system-level behavior of faults controls seismic activity and regional deformation; infer rates of change in stress from geodetic and seismic observations; compare and interpret quantitatively short-term geodetic rates of deformation, long-term geologic rates, and rates predicted by seismicity simulators; quantify the space-time behavior of the California fault system in ways that are targeted to test models of earthquake occurrence and stress evolution; foster collaborations to obtain outside funding to support large, coordinated data-gathering efforts; determine how geologic deformation is partitioned between slip on faults and distributed off fault deformation and how geodetic strain is partitioned between long-term permanent and short-term elastic strain and on-fault slip or permanent distributed strain.

Deformation Models: Develop, validate, and facilitate use of modular 3D quasi-static codes for simulating crustal motions utilizing realistic, highly resolved geometries and rheological properties (e.g., Burgers body viscoelasticity, rate-state friction, poroelasticity, damage rheology); develop continuum representations of fault system behavior on scales smaller than can be resolved as faulting on computationally feasible meshes; develop a closed volume representation of California (including the Community Block Model, CBM) that unifies the geometric representations of the CFM and the CVM and that serves as a basis for efficient meshing and remeshing of models; generate finite element meshes of the CBM; assess mechanical compatibility of the CFM and how slip is transferred between recognized fault segments; develop a reference model of the time-dependent stress transfer and deformation associated with the 1992 Landers earthquake; extend models of time-dependent stress transfer and deformation of California to cover multiple earthquake cycles addressing geologic slip rates, geodetic motions (including CMM 4.0), and earthquake histories; use these models to infer fault slip state-wide (see RELM and WGCEP components of 5. Seismic Hazard Analysis), 3D rheologic structure, and fault interactions through the transfer of stresses; couple numerical models of the interseismic period to quasi-static full-cycle fault models to better constrain stress transfer and conditions and processes at the start of dynamic rupture, including forcing by realistic coseismic displacements and dynamic stresses (with Source Physics); develop tectonic models that explain the inferred rates of fault slip; develop a plan for post-earthquake geodetic deployments.

Seismicity Evolution Models: Determine the effects of fault system scale and resolution in models of geometrically complex fault systems; develop and validate rapid simulation methods for modeling earthquakes in fault systems over a wide range of magnitudes (with Source Physics); develop, validate, and facilitate use of codes for ensemble models simulating earthquake catalogs using CFM, USR and CBM, as well as effects of faults not included in CFM; incorporate constraints (including data assimilation) from geologic slip rates, geodetic data, realistic boundary conditions, and fault rupture parameterizations, including rate-state friction and normal stress variations; assess the processes that control the space-time-magnitude distribution of regional seismicity; quantify sources of complexity, including geometrical structure, stress transfer, fault zone heterogeneity, and slip dynamics; assess the utility of these models in forecasting Southern California earthquakes; search for statistically significant signals in the space-time- magnitude distribution of seismicity and understand their physical origin.
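For concreteness, one elementary building block of the simulated earthquake catalogs described above is inverse-transform sampling of a Gutenberg-Richter magnitude distribution. The b-value and minimum magnitude below are illustrative assumptions, not SCEC-recommended values, and real simulators add spatial structure, fault geometry, and interaction physics on top of this.

```python
import math
import random

def sample_gr_magnitudes(n, m_min=3.0, b=1.0, seed=0):
    """Draw n magnitudes from a Gutenberg-Richter (exponential) distribution
    above m_min by inverse-transform sampling: N(>=m) ~ 10**(-b * (m - m_min))."""
    rng = random.Random(seed)
    # 1 - random() lies in (0, 1], avoiding log10(0)
    return [m_min - math.log10(1.0 - rng.random()) / b for _ in range(n)]

catalog = sample_gr_magnitudes(20000)
```

A quick sanity check: for b = 1, magnitudes above m_min are exponential with mean 1/ln(10), roughly 0.43 magnitude units.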

3. Earthquake Source Physics

Numerical Simulations Of The Earthquake Source And Earthquake Cycle

Conduct numerical simulations of dynamic rupture nucleation, propagation, and termination that include known or realistic complexity in fault geometry, material properties, stress state, and constitutive relations. Compare results with source and fault zone observations. Use this information to test hypotheses or develop new testable hypotheses about earthquake source physics. Use this information to generate earthquake scenarios, especially for Southern California (Joint with Ground Motions Group).

Explore what aspects of the source generate high-frequency waves. (Joint with Ground Motions Group)

Explore what aspects of the source and fault zone determine propagation direction (directivity).

Use numerical simulations results to guide seismic hazards analysis. (Joint with SHA Group)


Participate in the code validation exercises for 3D spontaneous rupture simulations (also Pathway 3 of the SCEC ITR) by performing benchmark tests and comparing results with the rest of the ESP and Pathway 3 community.

Bridge the interface between Earthquake Source Physics and Fault Systems by conducting physics-based fully dynamic multi-earthquake-cycle simulations, and by determining if simpler, quasi-dynamic or quasi-static simulations may suffice as a proxy for full dynamic simulations in long-term fault-systems simulations. (Joint with FS Focus Group)

Participate in NGA-H. Investigate particular problems of interest to NGA-H. (Joint with Implementation Interface Group)

Participate in YM Extreme Ground Motions Project. Investigate particular problems of interest to YM. (Joint with Implementation Interface Group)

Reference Earthquakes

Building on efforts started in 2004, continue work on a database that includes geodetic, geologic, and seismological data (and metadata), as well as models derived from them. The goal is to facilitate comparison of different models and analysis of multiple datasets. The reference earthquake database will be used for testing/validation of earthquake physics concepts and modeling techniques, and will serve as a template for additional reference earthquakes.

In-Situ Studies Of Fault-Zones (Exhumed Faults & Deep Cores)

Examine and document features of fault zones in Southern California, including the San Andreas fault system, Parkfield, and the SAFOD site, that reveal the mechanical, chemical, thermal, and kinematic processes that occur during dynamic rupture. Include measurements and inferences of on-fault and near-fault stress, slip-zone thickness, fine-scale fault-zone geometry, adjacent damage, and fluid content at seismogenic depths. (Joint with Geology and FARM Groups)

Earthquake Scaling

Determine to what extent earthquake behavior depends on earthquake size. Determine if there are breaks or trends in the scaling behavior of quantities such as stress drop or radiated seismic energy. If so, determine how they can constrain models of the earthquake source.
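One standard way to pose the scaling question quantitatively is through the Eshelby circular-crack relation between seismic moment, source radius, and static stress drop; self-similar ruptures, with moment growing as the cube of the radius, then imply a constant stress drop. A minimal sketch (the example moment and radius are illustrative):

```python
def stress_drop_circular(moment_nm, radius_m):
    """Eshelby circular-crack static stress drop in Pa:
    delta_sigma = (7/16) * M0 / r**3, with M0 in N*m and r in m."""
    return 7.0 / 16.0 * moment_nm / radius_m ** 3

# Self-similarity check: scaling M0 by 8 while doubling r leaves
# the stress drop unchanged.
small = stress_drop_circular(1.0e18, 5000.0)    # roughly Mw 6, 5 km radius
large = stress_drop_circular(8.0e18, 10000.0)
```

A departure of observed stress drops from this constancy across magnitudes is exactly the kind of scaling break the task above asks investigators to detect or rule out.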

Lab Studies Of The Earthquake Source (Joint with FARM Group)

Carry out lab experiments on faults in rock or analog materials to determine shear resistance at high slip speeds (on the order of 1 m/s) and stress conditions at seismogenic depths (or appropriately scaled conditions for analog materials).

Measure hydrologic properties of likely fault zone materials at high rates of deformation and fluid flow.

Conduct theoretical studies of expected behavior for possible high-speed weakening mechanisms.


Determine how changes in normal stress might affect shear resistance during dynamic rupture.

Compare results with dynamic rupture source observations. Use this information to test proposed constitutive relations or develop improved constitutive relations. Use this information to test numerical spontaneous rupture simulations of the earthquake source.

Earthquake Interaction As An Approach To Explain Earthquake Physics

Use observations of earthquake triggering or suppression to test models of earthquake interaction and constrain the physics of earthquake rupture nucleation, propagation, and arrest.

4. Ground Motions

Broadband Ground Motion Modeling Project: Multiple groups/investigators will calculate synthetic seismograms up to 10 Hz by combining deterministic and high-frequency (stochastic or other) synthetics and comparing with observations. Validation of methodologies should use ground motions from pertinent earthquakes (e.g., Northridge, Landers, Hector Mine, Loma Prieta). Successful approaches will be used to extend existing 3D scenarios* to broadband by the end of SCEC2, and may be used in the NGA-H Program, described in more detail in Section VIII, Part A4, Implementation Interface Focus Area.
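The deterministic/stochastic combination can be sketched as a pair of complementary frequency-domain tapers around a matching frequency. The 1 Hz crossover and taper width below are illustrative choices, not values prescribed by the project, and practical broadband codes use more sophisticated matched filtering.

```python
import numpy as np

def hybrid_broadband(det, stoch, dt, f_match=1.0, half_width=0.2):
    """Merge a deterministic low-frequency synthetic with a stochastic
    high-frequency synthetic using complementary linear tapers around
    f_match (Hz). Both inputs are equal-length time series sampled at dt (s)."""
    n = len(det)
    freqs = np.fft.rfftfreq(n, d=dt)
    # taper = 1 below the crossover band, 0 above, linear in between
    low = np.clip((f_match + half_width - freqs) / (2.0 * half_width), 0.0, 1.0)
    spec = np.fft.rfft(det) * low + np.fft.rfft(stoch) * (1.0 - low)
    return np.fft.irfft(spec, n)
```

Because the two tapers sum to one at every frequency, passing the same signal as both inputs returns it unchanged, which makes the merge easy to sanity-check.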

Inversion and CVM Testing: Use data from well-recorded earthquakes to assess wavefield simulations based on the CVM. Identify regions where the CVM fails to predict ground motion. Quantify the frequency range over which the CVM can adequately model observed waveforms. Develop methods to invert ground motion data for source and path effects, and to quantify their resolution and uncertainties. Improve the S-wave velocity structure in the CVM and the Harvard model by inversion of waveform data.

SCEC Scattering and Attenuation Model: Attenuation/scattering models are to complement the SCEC CVM and be used in calculating high frequency synthetics. Develop methods/experiments to identify and model sources of scattering/attenuation in seismic body waves and coda by analyzing data from CISN and borehole instruments.

Non-Linear Site Response: Develop methods for incorporating nonlinear site response for large amplitude ground motion events in Southern California. Ideas that improve our understanding of linear site response should lead to a new understanding of how site response varies spatially. Investigate soil- (building) structure interaction and its effect on ground response including nonlinear effects.

High Frequency Wavefield: Develop strategies/experiments to separate source and path effects in high-frequency wavefields. This could include empirical Green’s functions, results from the scattering model, and inversion. Develop hybrid models (e.g., 3D+asymptotic methods, 3D+2D, 3D+1D) to include higher frequencies. Evaluate basin-edge effects.

Building Response: Develop collaborations with engineers (with IIG) to add building response to synthetic seismograms and compare with the COSMOS and NGA databases for seismograms from different floors. Evaluate the relative effects on damage of near-field acceleration and resonance excitation by long-duration coda. Collaborations that leverage outside funding sources for engineering analyses are desirable (e.g., PEER, MCEER, etc.).

Towards the SCEC Synthetic Catalog: Collaborate with CME to set up an internal website to compare observed seismograms from medium-sized earthquakes with synthetics. This will require frequency- and depth-dependent site effects and a scattering operator at CISN stations.

*A description of scenario earthquakes is posted on the SCEC website http://webwork.sdsc.edu:10081/sceclib/portal.

5. Seismic Hazard Analysis

OpenSHA: Contribute to the Community Modeling Environment for Seismic Hazard Analysis (known as OpenSHA; www.OpenSHA.org). This is an open-source, object-oriented, and web-enabled framework that will allow various, arbitrarily complex (e.g., physics-based) earthquake rupture forecasts, ground-motion models, and engineering response measures to plug in for SHA. Part of this effort is to use information technology to enable the various models and the databases they depend upon to be geographically distributed and run-time accessible. Contributions may include: 1) implementing any of the various components (in Java or another language), 2) testing any of the various components or applications, 3) extending the existing framework to enable other capabilities, such as vector-valued hazard analysis, to interface with existing risk/loss estimation tools, or to web-enable the testing of the various RELM forecast models, and 4) conducting outreach activities (e.g., workshops) with potential user groups or developing educational modules.
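The plug-in decomposition that OpenSHA formalizes can be illustrated in miniature: an earthquake rupture forecast supplies ruptures with occurrence probabilities, a ground-motion model supplies a conditional exceedance distribution, and the hazard calculation combines them. The sketch below (in Python rather than OpenSHA's Java, with invented rupture parameters) shows the classical combination for one intensity-measure level.

```python
import math

def lognormal_exceedance(x, median, sigma_ln):
    """P(IM > x) for a lognormal ground-motion model."""
    z = (math.log(x) - math.log(median)) / sigma_ln
    return 0.5 * (1.0 - math.erf(z / math.sqrt(2.0)))

def hazard_curve_point(ruptures, x):
    """Probability that the intensity measure exceeds x, treating ruptures
    as independent: 1 - prod(1 - P_rup * P(IM > x | rup)).
    ruptures: iterable of (p_occur, median_im, sigma_ln) tuples."""
    p_no_exceedance = 1.0
    for p_occur, median, sigma_ln in ruptures:
        p_no_exceedance *= 1.0 - p_occur * lognormal_exceedance(x, median, sigma_ln)
    return 1.0 - p_no_exceedance

# Invented two-rupture forecast; intensity measure in g
forecast = [(0.10, 0.20, 0.5), (0.05, 0.40, 0.5)]
```

Swapping in a different forecast or ground-motion function changes only the inputs, not the hazard calculation, which is the essence of the framework's modularity.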

Regional Earthquake Likelihood Models (RELM): Via the RELM working group, develop, submit (for testing and SHA), and publish viable earthquake-forecast models for southern California or the entire state (the more physics-based approaches should be developed in coordination with the Fault Systems focus group). Of particular interest are simulation methods to extend "next-event" forecasts to forecasts of all possible sequences of events. Continue the development of shared data resources needed by the RELM working group, especially in terms of making them on-line and machine readable. These activities should be coordinated with other SCEC focus/disciplinary groups as appropriate. Establish and implement quantitative tests of the various forecast models using observed seismicity, precarious-rock constraints, historically observed intensity levels, or other viable approaches. Conduct workshops to facilitate the various RELM activities (e.g., to establish standards for testing the models).
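The quantitative testing called for here typically compares a gridded Poisson forecast against observed bin counts via a joint log-likelihood, the basis of the RELM likelihood tests. The toy rates and counts below are invented for illustration.

```python
import math

def poisson_log_likelihood(forecast_rates, observed_counts):
    """Joint log-likelihood of observed earthquake counts in
    space-magnitude bins under a gridded Poisson forecast.
    Forecast rates must be strictly positive."""
    return sum(n * math.log(rate) - rate - math.lgamma(n + 1)
               for rate, n in zip(forecast_rates, observed_counts))

# Three invented bins: a forecast matching the observations should
# score higher than one that badly misallocates its rate.
observed = [2, 0, 1]
well_calibrated = poisson_log_likelihood([2.0, 0.1, 1.0], observed)
poorly_calibrated = poisson_log_likelihood([0.1, 3.0, 0.1], observed)
```

Ranking competing forecasts by this statistic, with significance assessed against catalogs simulated from each forecast, is the standard-setting step the workshops above are meant to formalize.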

Contribute to the ongoing Working Group on California Earthquake Probabilities (WGCEP): We encourage contributions to the WGCEP’s effort to develop a time-dependent Uniform California Earthquake Rupture Forecast (UCERF), which is being funded in part by the California Earthquake Authority for use in setting earthquake insurance rates. Planned innovations include relaxing the assumption of persistent rupture boundaries (fault segmentation), allowing fault-to-fault jumps, using kinematically consistent deformation models, and including earthquake-triggering effects. The model will be deployed in an adaptable and extensible framework so that modifications can be made as warranted by scientific developments, the collection of new data, or the occurrence of significant earthquakes. The model will be “living” to the extent that the update and evaluation process can occur in short order. The modular design means that relatively simple versions of each component can be used at first, with more sophisticated components added later. Specific information on the components and other potential contributions can be gleaned from material at http://www.relm.org/models/WGCEP (a temporary location). We specifically encourage the following: 1) compilation and analysis of paleoseismic data; 2) refinement of 3D fault models (including those in N. California); 3) contribution of statewide deformation models or the compilation of data used therein (e.g., GPS); 4) development of models or constraints on the long-term rate of all possible earthquakes throughout California (e.g., magnitude-area relationships, or constraints on the magnitude-frequency distribution of the region or on individual faults); 5) application of dynamic rupture modeling to quantify the probability of fault-to-fault jumps; 6) application of physics-based earthquake simulations to test possible approaches for defining time-dependent probabilities; and 7) development of any type of viable, time-dependent probability model that can plug into our framework (e.g., renewal-type models in which strict segmentation is relaxed, empirically based triggering models, or theoretically based stress-interaction models). Again, please see the above URL for details.
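One of the simplest time-dependent components of the kind sought in item 7 is a renewal model, where the probability of rupture in the next dt years, given t years of quiescence since the last event, is P = [F(t+dt) - F(t)] / [1 - F(t)] for a recurrence-time distribution F. A sketch using a lognormal recurrence distribution, with purely hypothetical fault parameters:

```python
import math

def lognormal_cdf(t, median, sigma_ln):
    """CDF of a lognormal recurrence-time distribution."""
    if t <= 0:
        return 0.0
    z = (math.log(t) - math.log(median)) / sigma_ln
    return 0.5 * (1.0 + math.erf(z / math.sqrt(2.0)))

def conditional_probability(t_elapsed, dt, median, sigma_ln):
    """P(rupture in the next dt years | no rupture for t_elapsed years)."""
    f_now = lognormal_cdf(t_elapsed, median, sigma_ln)
    f_later = lognormal_cdf(t_elapsed + dt, median, sigma_ln)
    return (f_later - f_now) / (1.0 - f_now)

# Hypothetical fault: median recurrence 150 yr, aleatory sigma_ln = 0.5.
# The 30-yr conditional probability grows as elapsed time increases,
# unlike a Poisson model, for which it would stay constant.
for elapsed in (50, 100, 150):
    print(elapsed, round(conditional_probability(elapsed, 30, 150.0, 0.5), 3))
```

UCERF-style frameworks would swap in other distributions (e.g., Brownian passage time) or entirely different probability models behind the same conditional-probability interface.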


Improved Ground-Motion Models and Intensity-Measure Relationships: Work with the Ground Motion focus group and/or the Implementation Interface to develop improved models for predicting ground motion and/or intensity measures (empirical attenuation relationships, waveform modeling, or hybrid approaches). Of particular interest are models that can take an arbitrary earthquake rupture and site, and give back a suite of synthetic seismograms (the suite representing the propagation of all influential uncertainties). Proposals to implement new types of Intensity Measures (new functionals of ground motion, or vectors thereof) that predict engineering damage measures better than traditional intensity measures (e.g., PGA, SA) are also encouraged.
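For orientation, an empirical attenuation relationship of the kind referred to above typically predicts a lognormal distribution of an intensity measure given magnitude and distance. The sketch below uses a schematic functional form with invented coefficients (not from any published relation) to show how such a model yields the exceedance probabilities that feed hazard calculations:

```python
import math

def median_ln_pga(magnitude, distance_km):
    """Schematic attenuation form: ln(median PGA, g) = a + b*M - c*ln(R + d).
    Coefficients are illustrative only, not from any published relation."""
    a, b, c, d = -3.5, 0.9, 1.2, 10.0
    return a + b * magnitude - c * math.log(distance_km + d)

def prob_exceed(magnitude, distance_km, pga_level_g, sigma_ln=0.6):
    """P(PGA > level | rupture), from the lognormal aleatory variability
    (standard deviation sigma_ln in natural-log units)."""
    z = (math.log(pga_level_g) - median_ln_pga(magnitude, distance_km)) / sigma_ln
    return 0.5 * math.erfc(z / math.sqrt(2.0))

# Exceedance probability falls with distance and rises with magnitude.
print(prob_exceed(7.0, 10.0, 0.2))
print(prob_exceed(7.0, 50.0, 0.2))
```

A waveform-modeling or hybrid approach replaces the closed-form median with simulated seismograms, but must still deliver the same product: a distribution of the intensity measure for an arbitrary rupture and site.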


C. Special Projects

The following are SCEC special projects with which proposals in above categories can be identified.

1. Continental Borderland (www.scec.org/borderland)

SCEC recognizes the importance of the offshore Southern California Continental Borderland in terms of understanding the tectonic evolution, active fault systems, and seismic hazard of Southern California. SCEC encourages projects that focus on the offshore region's 1) plate-boundary tectonics, including the currently active Pacific-North American plate motions, and its lithospheric seismic and geologic structure; 2) fault systems, including the distribution and subsurface geometry of active faults, the Quaternary rates of fault slip, and the interactions between intersecting fault systems in three dimensions with time (for example, resolving how high-angle and low-angle faults interact to accommodate long-term oblique finite strain); and 3) offshore earthquakes, including their parameters and the hazard potential of offshore geologic structures in general.

To address these issues, new methods, new datasets, and in some cases new technology may need to be developed and/or acquired. This includes the re-examination and analyses of newly released grids of industry seismic data to better quantify the location, subsurface geometry and late-Quaternary history of active offshore structures. More comprehensive detailed mapping of active offshore faults will likely require complete coverage of the Borderland with high-resolution multibeam bathymetry or other high-resolution seafloor imaging systems. Development of high-resolution techniques for conducting paleoseismology in a submarine environment will require innovative multidisciplinary techniques for imaging, sampling, and dating. Long-term monitoring of earthquake activity and geodetic strain in the Borderland will require the establishment of seafloor observatories. Such efforts may be best developed in collaboration with other disciplines (climate, oceanography, marine habitat studies, etc.), programs (EarthScope) and agencies (NOAA, NSF, NURP, etc.). SCEC wishes to encourage and endorse cooperative and collaborative projects that promote these objectives.

2. Information Technology (www.scec.org/cme)

SCEC needs to implement the tools of information technology (IT) to carry out its research agenda. A major collaboration involving SCEC scientists and IT researchers was recently funded by the NSF Information Technology Research Program to develop an advanced information infrastructure for earthquake science in Southern California (the SCEC Community Modeling Environment). The Center encourages participation by SCEC scientists in its IT activities, either directly or as part of ongoing research projects. These include: 1) defining the data structures needed to exchange information and computational results in SCEC research, including implementing these data structures via XML schema for selected computational pathways in seismic hazard analysis and ground-motion simulation; 2) developing, verifying, benchmarking, documenting, and maintaining SCEC community models; 3) developing tools for visualizing earthquake information that improve the community’s capabilities in research and education; and 4) organizing collections for, and contributing IT capabilities to, the Electronic Encyclopedia of Earthquakes (E3).

VIII. SCEC COMMUNICATION, EDUCATION, AND OUTREACH PLAN

SCEC is a community of over 500 scientists, students, and staff from 50 institutions across the United States, in partnership with many other science, engineering, education, and government organizations worldwide. To facilitate applications of the knowledge and scientific products developed by this community, SCEC maintains a Communication, Education, and Outreach (CEO) program with four long-term goals:

Coordinate productive interactions among a diverse community of SCEC scientists and with partners in science, engineering, risk management, government, business, and education.

Increase earthquake knowledge and science literacy at all educational levels, including students and the general public.

Improve earthquake hazard and risk assessments.

Promote earthquake preparedness, mitigation, and planning for response and recovery.

Short-term objectives are outlined below. Many of these objectives present opportunities for members of the SCEC community to become involved in CEO activities, which are for the most part coordinated by CEO staff. To support the involvement of as many others as possible, budgets for proposed projects should be on the order of $2,500 to $5,000 (Implementation Interface research proposals excluded). Hence proposals that include additional sources of support (cost-sharing, funding from other organizations, etc.) are highly recommended. Those interested in submitting a CEO proposal should first contact Mark Benthien, director for CEO, at 213-740-0323 or [email protected].

A. CEO Focus Area Objectives

1. SCEC Community Development and Resources (activities and resources for SCEC scientists and students)

SC1 Increase diversity of SCEC leadership, scientists, and students
SC2 Facilitate communication within the SCEC Community
SC3 Increase utilization of products from individual research projects

2. Education (programs and resources for students, educators, and learners of all ages)

E1 Develop innovative earth-science education resources
E2 Interest, involve, and retain students in earthquake science
E3 Offer effective professional development for K-12 educators

3. Public Outreach (activities and products for media reporters and writers, civic groups, and the general public)

P1 Provide useful general earthquake information
P2 Develop information for the Spanish-speaking community
P3 Facilitate effective media relations
P4 Promote SCEC activities

4. Implementation Interface (activities with engineers and other scientists, practicing professionals, risk managers, and government officials)

I1 Engage in collaborations with earthquake engineering researchers and practitioners
I2 Develop useful products and activities for practicing professionals


I3 Support improved hazard and risk assessment by local government and industry
I4 Promote effective mitigation techniques and seismic policies

B. Implementation Interface Program

The purpose of the Implementation Interface is to move knowledge about earthquake hazards developed by SCEC into practice. Essential to this objective is fostering collaboration between SCEC scientists and partners who are involved in research or practice in earthquake engineering or other earthquake-related technical disciplines. Individual SCEC investigators or groups of SCEC investigators are encouraged to identify collaborative projects with individuals or groups of investigators from other organizations. SCEC investigators should request funding within SCEC Focus Groups and describe how the project will relate to projects with partners, such as those listed in Table 1 below. Engineers and other potential partners should seek funding from their own organizations.

As a guide to this process, Table 1 lists potential future project topics that could involve collaboration between SCEC and earthquake engineering organizations. Table 1 also identifies potential co-sponsors of collaborative implementation-oriented work. The identification of these potential collaborative projects and potential co-sponsors does not imply a commitment on the part of these organizations to co-fund projects. These organizations have their own internal processes for reviewing and approving projects, whose schedules are not necessarily synchronous with the SCEC schedule. Accordingly, Table 1 should be viewed as a preliminary identification of potential mutual interests that could be pursued with additional discussion, and does not preclude other ideas for collaboration with these or other earthquake-related research organizations.


Table 1. Potential Advanced Implementation Interface Projects
(Potential co-sponsors are listed in parentheses after each project.)

Ground Motion Time Histories
- Provide spatial wave-field and distributed input ground motions for bridges (MAE, MCEER, PEER)
- Validation of simulated ground motions for performance assessment of buildings and bridges, including site effects (MAE, MCEER, PEER)

Seismic Hazards
- Seismic hazard and risk from aftershocks (MAE, MCEER, PEER)
- Communication of OpenSHA to potential users (EERI, SEAOC)

Information Technology
- Exchange information on information technologies (NEES)
- Simulation and visualization of earthquake hazards, ground motions, geotechnical/structural response, and damage (MAE, MCEER, PEER, NEES)

Ground Motion Response
- Improved regional site response factors from detailed surface geology and from geotechnical borehole databases (follow-through on SCEC Phase III) (CGS, MAE, PEER-Lifelines)
- Seismic velocity profiles from micro-tremor arrays for deep Vs profiles to complement SASW testing (MAE, PEER-Lifelines)
- Mapping of basin-edge effects using geological data consistent with the engineering model developed in the SCEC/NGA-E “Basins” project (CGS, PEER-Lifelines)

Relationship Between Ground Motion Characteristics and Building Response
- Identify damaging characteristics of ground motions, and map associated hazard intensity measures, with feedback from engineers to seismologists identifying aspects of ground motions that are key to predicting damage (MAE, MCEER, PEER)
- How ground motions enter low-rise buildings (MAE, MCEER, PEER)
- End-to-end simulations (rupture to rafters) (MAE, MCEER, PEER)

Societal Implications of Earthquake Hazard
- Risk and implications of earthquake hazards on distributed lifeline systems and regional economies: identify the real vulnerabilities (MAE, MCEER, PEER, PEER-Lifelines)

Loss Estimation
- Loss-estimation methodology for evaluating societal impacts of SCEC products, such as alternative RELM fault models or alternative ground-motion models (MAE, MCEER, PEER)


APPENDIX A: LONG-TERM RESEARCH GOALS

This section outlines the SCEC science priorities for the five-year period from February 1, 2002, to January 31, 2007. Additional material on the science and management plans for the Center can be found in the SCEC proposal to the NSF and USGS (http://www.scec.org/SCEC).

Long-term research goals have been formulated in six problem areas: plate-boundary tectonics, fault systems, fault-zone processes, rupture dynamics, wave propagation, and seismic hazard analysis. These goals delineate the general areas of research where substantial progress is expected during the next five years, and they provide the scientific context for the short-term objectives outlined in Section VII.

Plate-Boundary Tectonics

Goal: To determine how the relative motion between the Pacific and North American plates is distributed across Southern California, how this deformation is controlled by lithospheric architecture and rheology, and how it is changing as the plate-boundary system evolves.

Key Questions:

How does the complex system of faults in Southern California accommodate the overall plate motion? To what extent does distributed deformation (folds, pressure-solution compaction, and motions on joints, fractures, and small faults) play a role within the seismogenic layer of the crust?

What lateral tractions drive the fault system? What are the directions and magnitudes of the basal tractions? How do these stresses compare with the stresses due to topography and variations in rock density? Do they vary through time?

What rheologies govern deformation in the lower crust and mantle? Is deformation beneath the seismogenic zone localized on discrete surfaces or distributed over broad regions? How are these deformations related to those within the seismogenic zone?

What is the deep structure of fault zones? Are major strike-slip faults such as the SAF truncated by décollements or do they continue through the crust? Do they offset the Moho? Are active thrust faults best described by thick-skin or thin-skin geometries?

How is the fault system in Southern California evolving over geologic time, what factors are controlling the evolution, and what influence do these changes have on the patterns of seismicity?

Fault Systems

Goal: To understand the kinematics and dynamics of the plate-boundary fault system on interseismic time scales, and to apply this understanding in constructing probabilities of earthquake occurrence in Southern California, including time-dependent earthquake forecasting.

Key Questions:

What are the limits of earthquake predictability, and how are they set by fault-system dynamics?


How does inelastic deformation affect strain accumulation and release through the earthquake cycle? Does inelastic deformation accumulated over repeated earthquake cycles give rise to landforms and geologic structures that can be used to constrain deformation rates and structural geometries on time intervals of thousands to hundreds of thousands of years?

Are there patterns in the regional seismicity related to the past or future occurrence of large earthquakes? For example, are major ruptures on the SAF preceded by enhanced activity on secondary faults, temporal changes in b-values, or local quiescence? Can the seismicity cycles associated with large earthquakes be described in terms of repeated approaches to, and retreats from, a regional “critical point” of the fault system?

What are the statistics that describe seismic clustering in time and space, and what underlying dynamics control this episodic behavior? Is clustering observed in some fault systems due to repeated ruptures on an individual fault segment, or to rupture overlap from multiple segments? Is clustering on an individual fault related to regional clustering encompassing many faults?

What systematic differences in fault strength and behavior are attributable to the age and maturity of the fault zone, lithology of the wall rock, sense of slip, heat flow, and variation of physical properties with depth? Is the mature SAF a weak fault? If so, why? How are the details of fault-zone physics such as “critical slip distance” expressed at the system level?

To what extent do fault-zone complexities, such as bends, changes in strength, and other quenched heterogeneities control the nucleation and termination of large earthquakes and their predictability? How repeatable are large earthquakes from event to event, both in terms of location and slip distribution? How applicable are the “characteristic-earthquake” and “slip-patch” models in describing the frequency of large events? How important are dynamic cascades in determining this frequency? Do these cascades depend on the state of stress, as well as the configuration of fault segments?

How does the fault system respond to the abrupt stress changes caused by earthquakes? To what extent do the stress changes from a large earthquake advance or retard large earthquakes on adjacent faults? How does stress transfer vary with time? Does a more realistic lower-crustal rheology affect the spatial and temporal evolution of seismicity?

What controls the amplitude and time constants of the post-seismic response, including aftershock sequences and transient aseismic deformations? In particular, how important are induction of self-driven accelerating creep, fault-healing effects, poroelastic effects, and coupling of the seismogenic layer to viscoelastic flow at depth?

Fault-Zone Processes

Goal: To understand the internal structure of fault zones and the microscale processes that determine their rheologies in order to formulate more realistic macroscopic representations of fault-strength variations and the dynamic response of fault segments and fault networks.

Key Questions:

Which small-scale processes—pore-water pressurization and flow, thermal effects, geochemical alteration of minerals, solution transport effects, contact creep, microcracking and rock damage, gouge comminution and wear—are important in describing the earthquake cycle of nucleation, dynamic rupture, and post-seismic healing?

What fault-zone properties and processes determine velocity-weakening vs. velocity-strengthening behavior? How do these properties and processes vary with temperature, pressure, and composition? How do significant changes in normal stress modify constitutive behavior?

How does fault strength drop as slip increases immediately prior to and just after the initiation of dynamic fault rupture? Are dilatancy and fluid-flow effects important during nucleation?

What is the explanation of the discrepancy between the small values of the critical slip distance found in the laboratory (< 100 microns) and the large values (> 100 millimeters) inferred from the fracture energies of large earthquakes? What is the nature of near-fault damage and how can its effect on fault-zone rheology be parameterized?

How does fault-zone rheology depend on microscale roughness, mesoscale offsets and bends, variations in the thickness and rheology of the gouge zone, and variations in porosity and fluid pressures? Can the effects of these or other physical heterogeneities on fault friction be parameterized in phenomenological laws based on rate and state variables?

How does fault friction vary as the slip velocities increase to values as large as 1 m/s? How much is frictional weakening enhanced during high-speed slip by thermal softening at asperity contacts and by local melting?

How do faults heal? Is the dependence of large-scale fault healing on time logarithmic, as observed in the laboratory? What small-scale processes govern the healing rate, and how do they depend on temperature, stress, mineralogy, and pore-fluid chemistry?

Rupture Dynamics

Goal: To understand the physics of rupture nucleation, propagation, and arrest in realistic fault systems, and the generation of strong ground motions by earthquakes.

Key Questions:

What is the magnitude of the stress needed to initiate fault rupture? Are crustal faults “brittle” in the sense that ruptures require high stress concentrations to nucleate, but, once started, large ruptures reduce the stress to low residual levels?

How do earthquakes nucleate? What is the role of foreshocks in this process? What features characterize the early post-instability phase?

How can data on fault friction from laboratory experiments be reconciled with the earthquake energy budget observed from seismic radiation and near-fault heat flow? What is the explanation of the short apparent slip duration?

How much inelastic work is done outside a highly localized fault-zone core during rupture? Is the porosity of the fault zone increased by rock damage due to the passage of the rupture-tip stress concentration? What is the role of aqueous fluids in dynamic weakening and slip stabilization?

Do minor faults bordering a main fault become involved in producing unsteady rupture propagation and, potentially, in arresting the rupture? Is rupture branching an important process in controlling earthquake size and dynamic complexity?


Are strong, local variations in normal stress generated by rapid sliding on nonplanar surfaces or material contrasts across these surfaces? If so, how do they affect the energy balance during rupture?

What produces the slip heterogeneity observed in the analysis of near-field strong motion data? Does it arise from variations in mechanical properties (quenched heterogeneity) or stress fluctuations left in the wake of prior events (dynamic heterogeneity)?

Under what conditions will ruptures jump damaged zones between major fault strands? Why do many ruptures terminate at releasing step-overs? How does the current state of stress along a fault segment affect the likelihood of ruptures cascading from one segment to the next?

What are the physical mechanisms for the near-field and far-field dynamical triggering of seismicity by large earthquakes?

Ground Motion

Goal: To understand seismic ground motion in urbanized Southern California well enough to predict the ground motions from specified sources at frequencies up to at least 1 Hz, and to formulate useful, consistent, stochastic models of ground motions up to at least 10 Hz.

Key Questions:

How are the major variations in seismic wave speeds in Southern California related to geologic structures? How are these structures best parameterized for the purposes of wavefield modeling?

What are the contrasts in shear-wave speed across major faults in Southern California? Are the implied variations in shear modulus significant for dynamic rupture modeling? Do these contrasts extend into the lower crust and upper mantle?

How are variations in the attenuation parameters related to wave-speed heterogeneities? Is there a significant dependence of the attenuation parameters on crustal composition or on frequency? How much of the apparent attenuation is due to scattering?

What are the differences in near-fault ground motions from reverse, strike-slip, and normal faulting? In thrust faulting, how does energy trapped between the fault plane and free surface of the hanging-wall block amplify strong ground motions?

How does the structure of sedimentary basins affect the amplitude and duration of ground shaking? How much of the amplification pattern in a basin is dependent on the location of the earthquake source? Can the structure of sedimentary basins be determined in sufficient detail to usefully predict the pattern of ground shaking for future large earthquakes?

Is the ability to model recorded seismograms limited mainly by heterogeneity in source excitation, focusing by geologic structure, or wavefield scattering?

What role do small-scale heterogeneities and irregular interfaces play in wave propagation at high frequencies? How do they depend on depth, geological formation, and tectonic structure? How important is multiple scattering in the low-velocity, uppermost layers? Can stochastic parameterizations be used to improve wavefield predictions?

Seismic Hazard Analysis


Goal: To incorporate time dependence into the framework of seismic hazard analysis in two ways: (a) through the use of rupture dynamics and wave propagation in realistic geological structures, to predict ground-motion time histories for anticipated earthquakes, and (b) through the use of fault-system analysis, to forecast the time-dependent perturbations to average earthquake probabilities in Southern California.

Key Questions:

What factors limit fault-rupture propagation? How valid are the cascade and characteristic-earthquake models? What magnitude distribution is appropriate for Southern California?

How can geodetic (GPS and InSAR) measurements of deformation be used to constrain short- and long-term seismicity rates for use in seismic hazard assessment? How can geologic and paleoseismic data on faults be used to determine earthquake recurrence rates?

What temporal models and distributions of recurrence intervals pertain to faults in Southern California? Under what circumstances are large events Poissonian in time? Can PSHA be improved by incorporating non-Poissonian distributions?

Can physics-based scenario simulations produce more accurate estimates of ground-motion parameters than standard attenuation relationships? Can these simulations be used to reduce the high residual variance in these relationships?

What is the nature of near-fault ground motion? How do fault ruptures generate long-period directivity pulses? How do near-fault effects differ between reverse and strike-slip faulting? Can these effects be predicted?

What are the earthquake source and strong ground motion characteristics of large earthquakes (magnitudes larger than 7.5), for which there are few strong motion recordings? Can the shaking from large earthquakes be inferred from smaller events?

How does the nonlinear seismic response of soils depend on medium properties, amplitude, and frequency?


SECTION VI

ABSTRACTS FOR INVITED TALKS

The Parkfield/Landers Reference Earthquakes Digital Library

Brad Aagaard (USGS Menlo Park), Alexei Czeskis (Purdue University), Jessica Murray (USGS Menlo Park), Anupama Venkataraman (Stanford University), Greg Beroza (Stanford University)

Understanding the earthquake process involves integration of a wide variety of data and models from seismology, geodesy, and geology. While the geophysics community has established data centers to archive and distribute data in standard formats, earthquake models lack a similar facility. This often dramatically reduces their lifespan, because the models lack sufficient documentation and file format information to render them useful. The objective of the Reference Earthquakes Digital Library is to create the infrastructure to preserve the models in a central facility and to extend their lifespan for use in other studies. We have created a prototype Reference Earthquakes Digital Library for the 1992 Landers and 2004 Parkfield earthquakes. The digital library provides curation and ready access to archived models in standard formats with associated metadata in a common, permanent repository.

During a SCEC mini-workshop in May 2005, we solicited input from representatives of the earthquake research community to define the metadata requirements and formats for storing 5 different model types and 3 observation types. Using tools from the San Diego Supercomputing Center, we have created user-friendly interfaces for submitting and updating models, and for searching, browsing, and downloading the models. The interface also allows assessing the extent and variety of data available in the data centers for the reference earthquakes. The current digital library prototype contains fully functional interfaces for catalogs of relocated seismicity, rupture models, and measurements of surface rupture. The library also includes a partial catalog of the metadata for seismic and geodetic instrumentation relevant to the Landers and Parkfield earthquakes. A long-term goal is to integrate the reference earthquake project into the SCEC Community Modeling Environment. We will have a demonstration of the prototype running at the poster session. We look forward to feedback and submission of models from the community that will facilitate the further development of the Reference Earthquakes Digital Library.


Constraining Extreme Ground Motions in Seismic Hazard Analyses

Norman Abrahamson (Pacific Gas & Electric Company)

In standard Probabilistic Seismic Hazard Analysis (PSHA), as the hazard estimate is extended to very low probabilities, the ground motion continues to increase due to the variability of the ground motion. At very low probabilities, the PSHA can result in extreme ground motions that are so large that many seismologists consider them physically unrealizable. The difficulty is defining a quantitative limit on the ground motion that has a well-founded technical basis.
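The effect described above is easy to see in miniature: with an unbounded lognormal ground-motion distribution, the ground motion at a given annual exceedance frequency keeps growing as that frequency is pushed lower. The sketch below inverts a one-source hazard curve with purely illustrative parameters (event rate, median, sigma); nothing here corresponds to the Yucca Mountain study itself:

```python
import math

def ground_motion_at_frequency(annual_freq, event_rate=0.01,
                               median_g=0.3, sigma_ln=0.6):
    """Invert a one-source hazard curve: find the ground motion whose
    annual exceedance frequency equals annual_freq, assuming a single
    source with the given event rate and an unbounded lognormal
    ground-motion distribution (illustrative parameters only)."""
    # P(exceed | event) needed to reach the target annual frequency
    p = annual_freq / event_rate
    assert 0.0 < p < 1.0
    # z such that 1 - Phi(z) = p, found by bisection on the standard
    # normal survival function (keeps the sketch dependency-free)
    lo, hi = -10.0, 10.0
    for _ in range(200):
        mid = 0.5 * (lo + hi)
        if 0.5 * math.erfc(mid / math.sqrt(2.0)) > p:
            lo = mid
        else:
            hi = mid
    z = 0.5 * (lo + hi)
    return median_g * math.exp(sigma_ln * z)

# The motion grows without bound as the target probability shrinks,
# which is exactly the behavior that motivates physical-limit research.
for freq in (1e-4, 1e-6, 1e-8):
    print(freq, round(ground_motion_at_frequency(freq), 2))
```

Imposing a physical limit or an unexceeded-value constraint amounts to truncating or reweighting this distribution so the inverted curve flattens at low probabilities.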

To address this issue, a Cooperative Agreement between DOE and PG&E is under preparation. This project will be a collaborative effort between a number of Federal and State agencies, universities and private businesses to perform research and development activities to constrain extreme ground motions in PSHA. The research to be performed is placed in four categories: (A) physical limits, (B) unexceeded values, (C) event frequencies, and (D) PSHA implementation. The results of this study will be applicable to the PSHA for Yucca Mountain and other long-lived facilities.

Physical limits are the largest ground motions that could ever occur. These limits result from two factors: (1) the limit on ground motion that can be propagated through the rock due to non-linear effects, including slip on pre-existing fractures and creation of new fractures along the travel path of the seismic waves as they propagate from the fault rupture to the ground surface, and (2) the limit on ground motion that can be generated by the fault due to non-linear effects at the fault, including damage of the rock in the source region.

The unexceeded ground motions are ground motions that have not happened at a specific site for a specific time interval. They are based on the presence of geologic structures that would not exist if large ground motions had occurred at the site since the geologic structure was formed. The unexceeded ground motion depends on the sensitivity of the geologic structure to ground motion and the time since the geologic structure was formed.

Event frequencies of occurrence address how often very large ground motions will occur. In contrast to the physical limits, which address the absolute limit of the ground motion that can be generated by the source, research in this task will estimate the probability of extreme ground motions being generated by the source.

The physical limits, unexceeded ground motions, and event frequencies of occurrence will provide important constraints on the seismic hazard at Yucca Mountain. They can be used either in a new PSHA calculation or to revise the 1998 PSHA by down-weighting or rejecting branches of the logic tree that lead to hazard curves inconsistent with the results of this research. Some question remains as to how to apply these constraints in the PSHA format, particularly for the unexceeded ground motions. Procedures and guidelines will be developed for implementing the research results in the PSHA process.


Multi-scale shear velocity anisotropy at the SAFOD site

Naomi Boness (Stanford) and Mark Zoback (Stanford)

The region surrounding the San Andreas Fault Observatory at Depth (SAFOD) near Parkfield, CA is an ideal location to study the physical processes controlling shear velocity anisotropy in the brittle crust. We are able to utilize a suite of geophysical logs from the SAFOD boreholes and earthquake data recorded on the Pilot Hole array, the Northern California Seismic Network (NCSN), and the Southern California Seismic Network (SCSN). In addition, because the direction of maximum horizontal compression is at a high angle to the predominantly northwest-southeast structural trend, it is easy to distinguish stress-induced from structurally induced velocity anisotropy. Dipole sonic logs in the SAFOD boreholes indicate that the shear-wave velocity anisotropy of the granitic rocks surrounding the wellbore is on the order of 3 to 10% and is controlled by the tectonic stress field. The amount of stress-induced anisotropy in the granite decreases with depth, a consequence of the fact that as confining pressure increases, seismic velocity becomes less stress sensitive. Within the sedimentary section found at depth, however, both stress-induced and structural velocity anisotropy are observed in the dipole sonic logs. At depth intervals with well-developed bedding planes, the dipole sonic logs indicate that structurally induced anisotropy is dominant. An analysis of earthquake seismograms shows that ray paths through the Salinian granite adjacent to the fault exhibit fast shear-wave polarizations aligned with the direction of maximum horizontal compression (in agreement with pilot hole stress measurements). In contrast, ray paths along the San Andreas Fault (or through fault-parallel sedimentary structures) yield fast directions consistent with the northwest-southeast structural trend.
An analysis of shear waves from local crustal earthquakes recorded at regional seismic stations in California shows a similar signature, indicating that seismic anisotropy may be useful in mapping the crustal stress field. We conclude that within the San Andreas Fault Zone the structural fabric is the dominant mechanism responsible for velocity anisotropy, whereas in crust that does not have a predominant structural trend, the direction of maximum horizontal compression is the most important controlling factor.


Imaging of Active Seismogenic Faults with Space Geodesy

Yuri Fialko (UCSD)

Theoretical models of seismic rupture on major crustal faults predict significant yielding and cracking of host rocks around the primary fault slip surface. Multiple generations of ruptures result in the formation of a layer of damaged rocks whose mechanical properties may be substantially different from those of the ambient crust. Direct observational constraints on the presence, amount, and extent of earthquake-induced damage include geological mapping of crack density away from the slip surface and seismological observations of low-velocity zones along large crustal faults. Recently, macroscopic damage zones around several seismogenic faults were imaged with space geodetic observations. I will present examples from the Landers and Hector Mine (southern California) and Izmit (Turkey) earthquakes. The imaged fault zones are associated with faults in the vicinity of the earthquake ruptures that were strained, but not broken, by stressing from a nearby earthquake, resulting in centimeter-scale variations in surface displacements. Finite element modeling of the surface deformation data indicates 1-2 kilometer-wide damage zones with a reduction in the effective elastic modulus of about a factor of two. The depth extent of the low-rigidity material is at least 3-4 km, and possibly spans the entire brittle layer. Preliminary seismic observations have confirmed the existence of a wide low-velocity zone in the uppermost crust around the Calico fault, one of the example faults for which geodetic data suggest a massive compliant structure. A more detailed seismic experiment is currently underway to probe the deep structure of the Calico and Pinto Mountain faults in the Eastern California Shear Zone. Accurate and spatially dense geodetic observations are also beginning to reveal lateral variations in rock rigidity on a crustal scale, in particular due to dissimilar rocks on different sides of major crustal faults.
I will present measurements of interseismic strain accumulation on the southern San Andreas fault system which suggest a substantial (a factor of two to three, or perhaps greater) increase in the effective elastic moduli across the San Andreas and San Jacinto faults. These results suggest that the static and dynamic elastic moduli of rocks may systematically differ not only in an absolute, but also in a relative sense.

Structural versus Nonstructural Seismic Response to Ground Motion Ensembles

Tara Hutchinson (UCI)

Although the response of structural systems to an ensemble of hazard-level binned ground motions has become accepted as a design practice approach, this is not the case for nonstructural systems. For example, simplified measures such as peak horizontal floor acceleration (PHFA) are now widely used by the engineering community for estimating the vulnerability of both attached and unattached acceleration sensitive nonstructural elements and systems. This talk will explore the ramifications of both nonstructural and structural seismic response to ground motion ensembles and look at simplified design methods to encompass the range of uncertainty in response in a practical manner.


Recent Discoveries from the Taiwan Chelungpu-Fault Drilling Project

Kuo-Fong Ma (Institute of Geophysics, National Central University, Taiwan, ROC)

The Taiwan Chelungpu-fault Drilling Project (TCDP) drilled two holes 40 m apart (continuous coring from 500 m to 2000 m in hole A, and from 950 m to 1300 m in hole B) into the northern portion of the recently ruptured large-slip zone of the Chelungpu fault. The principal slip zone (PSZ) was identified at depths of 1111.23 to 1111.35 m in hole A, and a corresponding feature was identified in hole B at depths of 1136.50 to 1136.62 m, consistent with geological observations at the surface. The vertical extent of the damage zone relative to the fault core is asymmetric, with a greater thickness above the fault core. The PSZ is about 12 cm thick, is located near the lower boundary of the damage zone, and contains subsidiary cataclastic fracturing. Geophysical logging shows low seismic velocities and low electrical resistivity in the vicinity of the identified PSZ. Along with the orientation of observed fractures, these geophysical features support the interpretation that the deformed zone at 1111 m is the primary fault zone.

The actual grain sizes from the gouge layer of the core were measured by transmission electron microscopy (TEM) and shown to have diameters of 10 nm to 300 nm. Analysis of the clay minerals shows rich smectite content and the disappearance of kaolinite in the PSZ, suggesting fresh formation of the PSZ and a possible temperature rise of at least 400°C. The seismic fracture energy calculated from the dense strong-motion waveforms for the fault block beneath the drill site is 11.6 MJ/m^2, which yields a radiation efficiency of 0.25. If only a fraction of the seismic fracture energy contributed to creation of the gouge, the number of repeated earthquakes in the PSZ could be 10, 100, or 1000 for fractions of 1, 0.1, and 0.01, respectively. The corresponding ratios of gouge thickness to accumulated slip (T/D) would be 1.44x10^-3, 1.44x10^-4, and 1.44x10^-5 if we assume a slip of 8.3 m, as in the Chi-Chi earthquake, for every repeated event. The former two values of T/D are comparable to the geological data, so it is likely that the PSZ was formed by only tens of events. This may suggest that earthquakes did not repeatedly occur on the same slip zone over the entire history of the fault. Compared with observations from various other faults, our results suggest that the amount of energy needed to form a fault zone probably depends on the maturity and style of faulting.
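The T/D arithmetic above is easy to reproduce (gouge thickness and slip per event are the values quoted in the abstract; the assumed event counts correspond to the stated energy fractions):

```python
# Ratio of gouge thickness to accumulated slip, T/D, for different
# assumed numbers of Chi-Chi-like events (values from the abstract).
GOUGE_THICKNESS = 0.12  # m, thickness of the principal slip zone
SLIP_PER_EVENT = 8.3    # m, Chi-Chi earthquake slip assumed for each event

for fraction, n_events in [(1.0, 10), (0.1, 100), (0.01, 1000)]:
    t_over_d = GOUGE_THICKNESS / (n_events * SLIP_PER_EVENT)
    print(f"energy fraction {fraction:<5}  events {n_events:<5}  "
          f"T/D = {t_over_d:.2e}")
```

For 10 events this gives T/D = 0.12 / 83, i.e. the quoted value of roughly 1.4x10^-3.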


Overturning of Slender Blocks: Numerical Investigation and Application to Precariously Balanced Rocks in Southern California

Matthew Purvance (UNR)

Existing precariously balanced rocks near active faults provide a new source of data constraining ground motions over long time periods. To quantify these constraints, a set of synthetic blocks has been subjected to many different ground-motion time histories. The overturning response of a given block depends critically on its geometry along with attributes of the input ground motion. In particular, block rocking initiates due to the amplitude of the high-frequency motions (e.g., PGA), while subsequent overturning is related to the amplitude of lower-frequency motions (e.g., PGV, SA at 1 sec, SA at 2 sec). Comparisons between shake-table experiments and overturning predictions are presented for validation. A methodology has been developed to use this information to test prospective ground-motion models. Precariously balanced rocks along the Mojave section of the San Andreas fault and between the Elsinore and San Jacinto faults have been tested for consistency with vector-valued PSHA calculations. Ground-motion models using the attenuation relation of Abrahamson and Silva (1997) are largely inconsistent with these rocks; models calculated via the NGA attenuation relation of Abrahamson (2005) are more consistent with the precariously balanced rocks at these sites. This new source of data may contribute fundamentally to our understanding of long-term seismic hazard in southern California.
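The rocking-initiation criterion mentioned above has a simple quasi-static form for an idealized rigid rectangular block: rocking begins when horizontal acceleration exceeds roughly g times the base-to-height aspect ratio. The sketch below uses that textbook estimate with hypothetical dimensions; the full overturning analysis described in the abstract additionally depends on PGV and spectral content.

```python
import math

def rocking_threshold_g(half_width, half_height):
    """Quasi-static PGA (in units of g) needed to initiate rocking of a
    rigid rectangular block: a/g >= b/h = tan(alpha), where alpha is the
    angle from the rocking corner to the center of mass."""
    return half_width / half_height

# A slender precarious rock idealized as a 0.5 m wide x 2.0 m tall block
# (hypothetical dimensions) begins rocking near 0.25 g:
threshold = rocking_threshold_g(0.25, 1.0)
alpha_deg = math.degrees(math.atan(threshold))
print(f"rocking threshold ~ {threshold:.2f} g (alpha ~ {alpha_deg:.1f} deg)")
```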


ITR SESSION ABSTRACT

SCEC Community Modeling Environment (SCEC/CME)

A Collaboratory for Seismic Hazard Analysis

The goal of the SCEC/CME project is to develop an integrated environment in which a broad user community encompassing geoscientists, civil and structural engineers, educators, city planners, and disaster response teams can have access to powerful physics-based simulation techniques for seismic hazard analysis (SHA). To achieve this goal, SCEC scientists and collaborating computer scientists are developing a computational environment that allows researchers to describe, configure, instantiate, and execute complex computational pathways that include various earthquake simulation models.

In the first session, researchers working on the SCEC/CME project will describe several Project accomplishments, including seismic hazard analysis research being performed using the system, large-scale simulation capabilities, computer science technologies integrated into the system, and developments that may provide new capabilities to SCEC scientists. Project researchers will also discuss how developments on the SCEC/CME system are transforming the practice of seismic hazard analysis. The OpenSHA tools, developed collaboratively by SCEC/CME and USGS researchers, are a component-based software system for seismic hazard analysis. OpenSHA provides researchers with extensive new SHA computational capabilities, and in this session we will discuss several ways in which OpenSHA has been used in SHA research. In addition, project researchers will discuss the status of the SCEC/CME CyberShake effort, which is working to calculate hazard curves using 3D ground-motion modeling rather than attenuation relationships.

In the second session, Project researchers will discuss the TeraShake computational platform. A series of large earthquake simulations, called the TeraShake simulations, were run by Project participants. In this session, we discuss the scientific results of these simulations. The geophysical models and high performance simulation software used in the TeraShake simulations are now being adopted as a computing “platform” that can be used in other similar studies. We will discuss how the TeraShake software platform is used as a part of the TeraShake 2 simulations. The TeraShake 2 simulations combine dynamic rupture simulations and wave propagation simulations.

In the third session, Project researchers will describe computer science technologies that are integrated into the SCEC/CME system. Researchers will discuss knowledge-based tools developed on the Project that help users assemble complex series of calculations as scientific workflows. These tools guide users through the construction of scientific workflows and validate workflows once they are constructed. We will also describe the grid-based computing architecture and grid-based workflow system used to access the substantial computing and storage resources needed by SHA computations.

In the fourth session, Project researchers will discuss several future directions for SCEC/CME research and development. The scientific questions SCEC researchers are posing require highly scalable, high-performance computing techniques. We will discuss computing techniques being developed on the Project that may help us meet these high-performance computing requirements. Also, SCEC scientists continue to call for new capabilities from geophysical simulation codes. Project scientists will discuss simulation codes under development that provide new capabilities, including support for higher frequencies, topography, dynamic rupture simulations coupled to wave propagation simulations, Q, 3D structure, dipping faults, and curved faults.


2005 SCEC MEETING ABSTRACTS

Revisiting the 1906 San Francisco Earthquake: Ground Motions in the Bay Area from Large Earthquakes on the San Andreas Fault

Aagaard, Brad (USGS Menlo Park)

3-D simulations of long-period ground motions resulting from large ruptures of the San Andreas fault in the San Francisco Bay area are one component of the SF06 simulation project (www.sf06simulation.org). As one of five groups involved in the 3-D ground motion modeling, I am creating kinematic and dynamic (spontaneous) rupture simulations of the M6.9 1989 Loma Prieta earthquake and M7.8 1906-like events on the San Andreas fault. The simulations of the Loma Prieta earthquake serve as a means to demonstrate that the source model and finite-element discretization of the geologic model produce ground motions that are similar to recorded motions. The simulations of large events on the San Andreas fault will indicate what long-period ground motions may have been produced by the 1906 earthquake as well as the ground motions that might be expected in future, similarly sized events with different hypocenter locations.

My finite-element model encompasses a 250 km x 110 km x 40 km portion of the San Francisco Bay area and is designed for modeling wave propagation at periods of 2.0 sec and longer. The geologic structure, including the topography, fault surfaces, and material properties, is defined by the USGS Bay Area Velocity Model 05.0.0 (see Brocher et al.). The Loma Prieta simulations attempt to reproduce the recorded long-period shaking using both kinematic and dynamic rupture source models. One large San Andreas scenario aims to closely match the 1906 event (similar hypocenter and distribution of slip), while the others examine the effects of an epicenter near Santa Rosa and an epicenter near San Juan Bautista. As part of the SF06 Simulation Project, the long-period motions will be combined with short-period motions to create broadband ground motions, which will be archived for future use, such as earthquake engineering studies of the response of structures to strong ground motions.

Possible Triggered Aseismic Slip on the San Jacinto Fault

Agnew, Duncan (UCSD) and Frank Wyatt (UCSD)

We report evidence for deep aseismic slip following a recent earthquake on the San Jacinto fault (12 June 2005, 15:41:46.27, or 2005:163.654), based on data from long-base strainmeters at Pinon Flat Observatory (PFO). This magnitude 5.2 shock occurred within a seismic slip gap, but in a region of abundant small and moderate earthquakes that lies to the SE of a 15-km section of fault that is relatively aseismic (a seismicity gap). The earthquake has been followed by a normally decaying aftershock sequence from a volume commensurate with the likely rupture zone. However, it also triggered an increase of seismicity along the fault zone NW of the epicenter, in the seismicity gap. We have observed changes in strain rate at PFO that strongly support slip having occurred over the days following the earthquake. Two strain records (from the NS and EW instruments) show a clear strain change over the seven days after the earthquake, in equal and opposite senses. The NW-SE strainmeter shows no response until about a week after the earthquake. These signals are consistent with slip in the region of the triggered earthquakes, followed by slip further to the NW. The moment release inferred depends on the depth, which is not well constrained; if the slip is colocated with the seismicity, the aseismic moment release is equivalent to a magnitude 5.0 event, close to the mainshock moment.
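The equivalence between the inferred aseismic moment release and a magnitude 5.0 event can be checked with the standard Hanks-Kanamori moment-magnitude relation (a routine conversion, not taken from the authors' analysis):

```python
import math

def moment_to_mw(m0):
    """Moment magnitude from seismic moment M0 in N*m (Hanks & Kanamori)."""
    return (2.0 / 3.0) * (math.log10(m0) - 9.1)

def mw_to_moment(mw):
    """Seismic moment (N*m) corresponding to moment magnitude mw."""
    return 10.0 ** (1.5 * mw + 9.1)

# The inferred Mw 5.0 aseismic episode corresponds to roughly 4e16 N*m:
print(f"Mw 5.0 <-> M0 = {mw_to_moment(5.0):.2e} N*m")
```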


Constraints on Ruptures along the San Andreas Fault in the Carrizo Plain: Initial Results from 2005 Bidart Fan Site Excavations

Akciz, Sinan (UC Irvine), Lisa B. Grant (UC Irvine), J. Ramon Arrowsmith (ASU), Olaf Zielke (ASU), Nathan A. Toke (ASU), Gabriela Noriega (UC Irvine),

Emily Starke (UTulsa/SCEC), and Jeff Cornoyer (ASU)

Paleoseismic data on the rupture history of the San Andreas Fault (SAF) form the basis of numerous models of fault behavior and seismic hazard. The Carrizo segment of the SAF is one of the best places to study its rupture history because it has a proven paleoseismic record with excellent slip-rate and slip-per-event measurements. We conducted a paleoseismic study along the San Andreas fault at the Bidart Fan site to lengthen and refine the record of recent surface ruptures. Previous work at the Bidart Fan site (Grant and Sieh, 1994) demonstrated that it is an excellent place to develop a long chronology of earthquakes because (1) it has good stratigraphy for discriminating individual earthquakes, and (2) it has datable material: detrital charcoal and other datable organic material is commonly embedded in the deposits. During the 2005 field season we excavated and logged two 11-foot-deep trenches perpendicular to the SAF (BDT5 and BDT6) and collected 125 samples for radiocarbon dating. Here we present the BDT5 trench log and our preliminary interpretation of the 6+ events. Age control is based on radiocarbon ages of detrital-charcoal samples. Our best 30 charcoal samples from BDT5, which should help us constrain the ages of the four surface-rupturing events prior to the penultimate earthquake, are currently being processed at UCI’s new Keck AMS facility. A longer record of surface ruptures at the Bidart Fan site will help correlate ruptures between the Carrizo Plain and sites on the adjacent Mojave and Cholame segments, and therefore estimate the magnitude of earthquakes previously documented at other sites.

SCEC/UseIT: City Search and Display

Akullian, Kristy (USC)

Upon acceptance to the UseIT program, incoming interns received a rather cryptic email regarding their work in the program. Entitled the Grand Challenge, the email was accompanied by sundry disturbing disclaimers such as, "As you read the Challenge, you may not understand much of it, and you may have no idea how to proceed." But if our mentors were deliberately nebulous in their preliminary directions, it was because they understood the scope and complexity of the project we were being asked to undertake: the creation of an Earthquake Monitoring System using the intern-generated 3D program, SCEC-VDO.

The first few weeks of our summer experience were almost dizzying in their fast-paced conveyance of the working knowledge we would need in multiple fields of study. Entering the program as one of two interns with no programming experience, I quickly set to work learning the fundamentals of the Java programming language. Early in the summer I took on a project intended to allow the user greater flexibility in the selection of California cities displayed. As the summer progressed, the project evolved to include the many "bells and whistles" it now entails. Working collaboratively with my colleagues, I expanded a primitive Label Plug-in to include collections of cities, populations, SCEC institutions, intern schools, and even a search function to display a city of particular significance. These additions to SCEC-VDO will allow the end user a broader spatial reference and easier navigation within the program, as well as a heightened sense of the social consequences an earthquake in any given California location would entail.


Stress Drop Variations in the Parkfield Segment of the SAF from Earthquake Source Spectra

Allmann, Bettina (UCSD) and Peter Shearer (UCSD)

We analyze P-wave spectra from 34316 waveforms of earthquakes that occurred between 1984 and June 2005 on the San Andreas Fault in the vicinity of Parkfield, CA. We focus our analysis on a 70 km segment of the fault that ranges from the southernmost part of the creeping section over the Middle Mountain region beneath the M6.0 1966 hypocenter into the rupture zone of the M6.0 2004 Parkfield event. We apply a method that isolates source, receiver and path dependent terms, and we correct the resulting source spectra for attenuation using an empirical Green's function method. In order to determine earthquake corner frequencies, we assume a Madariaga-type source model with a best-fitting falloff rate of 1.6. This analysis results in stress drop estimates for about 3700 events with local magnitudes between 0.9 and 2.9. We observe a variation of median stress drop with hypocenter depth from about 0.5 MPa at 2 km depth to about 10 MPa at 14 km depth. We see no correlation of stress drop with estimated moment magnitude. When plotting median stress drops taken over a fixed number of events, we observe significant lateral and temporal variations in estimated stress drops. The creeping section north of the 1966 main shock shows generally lower stress drop values than the area to the south. Anomalously high stress drop values are observed in a 10 km wide area below the 1966 Parkfield main shock. We associate this area with the Middle Mountain asperity in which anomalously low b-values have been observed. South of the San Andreas Fault Observatory at Depth (SAFOD), aftershocks of the 2004 M6.0 earthquake have reduced high-frequency amplitudes, on average, compared to earlier events in the same region, suggesting either lower stress drops or increased attenuation in the months following the mainshock. In contrast, we observe a slight increase in apparent stress drop within the creeping section north of SAFOD. 
After the 2004 event, the Middle Mountain asperity persists as a high relative stress drop anomaly, but stress drop estimates within the northern part of the asperity are reduced. Finally, we compare our spatial variations in estimated stress drop with preliminary slip models for the 2004 Parkfield main shock.
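The corner-frequency-to-stress-drop step described above can be sketched as follows (a minimal illustration, not the authors' code; it assumes a Madariaga-type source with k of about 0.32 for P waves and a nominal shear velocity, and the example moment and corner frequency are invented):

```python
def stress_drop(m0, corner_freq, beta=3500.0, k=0.32):
    """Stress drop (Pa) from seismic moment M0 (N*m) and P-wave corner
    frequency (Hz), via Madariaga's source radius r = k * beta / fc
    and delta_sigma = (7/16) * M0 / r**3."""
    radius = k * beta / corner_freq  # source radius, m
    return (7.0 / 16.0) * m0 / radius ** 3

# Hypothetical small event: M0 = 2e12 N*m (roughly M 2.1), fc = 10 Hz,
# giving a stress drop on the order of 0.6 MPa:
print(f"stress drop ~ {stress_drop(2e12, 10.0) / 1e6:.2f} MPa")
```

Because stress drop scales with the cube of corner frequency, the depth trend reported above (0.5 MPa at 2 km to 10 MPa at 14 km) corresponds to modest shifts in fc at fixed moment.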

Geometrical Complexity of Natural Faults and Scaling of Earthquake Fracture Energy

Ando, Ryosuke (Columbia) and Teruo Yamashita (ERI, Univ of Tokyo)

Based on macroscopic geological observations, natural faults are never perfectly planar; they show complex geometry composed of bends, step-overs, and branches. Much finer observations show that faults are also never infinitely thin planes: they have internal structures consisting of thin principal slip planes and widely distributed damage zones. In this paper, we first focus on the formation process of the damage zones with regard to the geometry of the branches, and then demonstrate that the scaling of earthquake fracture energy could be related to the geometry of the fault damage zone. To model this hierarchical structure of faults, we construct a new multi-scale earthquake rupture model that introduces geometrical structure at a mesoscopic scale between the microscopic and macroscopic scales. Assuming homogeneous distributions of the initial stress and the residual stress, we obtain a self-similar geometry of the fault damage zone composed of arrays of secondary branches if the main fault length L is shorter than a critical length Lm. This self-similar geometry leads to a scaling of the macroscopic earthquake fracture energy Gc proportional to L. However, we also show that once L exceeds Lm this self-similarity is no longer satisfied. The existence of Lm points to limits of validity of phenomenological models of the fault damage zone and the fracture energy.


Additional Measurements of Toppling Directions for Precarious Rocks in Southern California

Anooshehpoor, Rasool (UNR), Matthew Purvance (UNR), and James Brune (UNR)

At last year’s SCEC meeting we reported that the spectacular line of precariously balanced rocks observed between the San Jacinto and Elsinore faults tends to be sensitive to rocking motion in nearly the same direction, toppling most easily for ground motions perpendicular to the strike of the faults. The high number of these rocks was unexpected, since current estimates are that sub-shear ruptures on long strike-slip faults produce strong fault-perpendicular motions, which might have been expected to topple the rocks.

However, a simple interpretation, possibly in terms of super-shear ruptures or predominant mode III ruptures, is complicated by a geologically obvious preferred fracture direction parallel to the two faults. Hence asymmetric fracture orientation or asymmetric strong ground motions, or a combination of the two, could produce the observed distribution. The findings are consistent with the highest particle velocities occurring in the fault-parallel direction for at least some earthquakes in the last several thousand years, knocking down the rocks sensitive to fault-parallel ground motions, but further checking was needed.

Here we report the results of additional surveys of precarious rock orientations: more observations between the Elsinore and San Jacinto faults, some between the San Andreas and San Jacinto faults, a number at Lovejoy Buttes and Victorville in the Mojave Desert (15 and 35 km from the San Andreas fault, respectively), and a number in the Granite Pass eroded pediment east of the Eastern California Shear Zone, where we would expect the rocks to have been relatively unaffected by earthquakes (as a base case). In total we measured orientations for about 40 additional rocks.

Preliminary conclusions are: (1) New measurements between the San Jacinto and Elsinore faults confirm a predominance of rocks sensitive to fault perpendicular motions, but we have not eliminated the possibility of control by structural grain, (2) Rocks between the San Andreas and San Jacinto faults have a broader azimuthal distribution, but still with a predominant toppling direction perpendicular to the two faults, and possibly some control by structural grain, (3) Rocks at Lovejoy Buttes have an even broader distribution of toppling azimuths and some control by the underlying fracture grain, and (4) Rocks at Granite Pediment, far removed from currently active faults, have a relatively random distribution of toppling directions and underlying fracture grain.

We conclude that, although structural fracture grain is clearly playing a significant role in determining rock orientations in many cases, there still seems to be an unexpected lack of rocks sensitive to fault-parallel ground motions. This might be caused by strong fault-parallel ground motions in some geologically recent earthquakes, possibly a result of supershear rupture velocities or a predominance of mode III ruptures.

Current Development at the Southern California Earthquake Data Center (SCEDC)

Appel, Vikki, Marie-Odile Stotzer, Ellen Yu, Shang-Lin Chen, and Robert Clayton (Caltech)

Over the past year, the SCEDC has completed, or is nearing completion of, three featured projects:

Station Information System (SIS) Development
The SIS will provide users with an interface to complete and accurate station metadata for all current and historic data at the SCEDC. The goal of this project is to develop a system that can interact with a single database source to enter, update, and retrieve station metadata easily and efficiently. The scope of the system is to develop and implement a simplified metadata information system with the following capabilities:

• Provide accurate station/channel information for active stations to the SCSN real-time processing system.
• Provide accurate station/channel information for active and historic stations that have parametric data at the SCEDC, i.e., for users retrieving data via STP from the SCEDC.
• Provide all necessary information to generate dataless SEED volumes for active and historic stations that have data at the SCEDC.
• Provide all necessary information to generate COSMOS V0 metadata.
• Be updatable through a graphical interface designed to minimize editing mistakes.
• Allow stations to be added to the system with a minimal but incomplete set of information, using predefined defaults that can be easily updated as more information becomes available. This aspect of the system becomes increasingly important for historic data, when some aspects of the metadata are simply not known.
• Facilitate statewide metadata exchange for real-time processing and provide a common approach to CISN historic station metadata.

Moment Tensor Solutions
The SCEDC is currently archiving and delivering moment magnitudes and moment tensor solutions (MTS) produced by the SCSN in real time, as well as post-processing solutions for events spanning back to 1999, at http://www.data.scec.org/catalog_search/CMTsearch.php.

The automatic MTS runs on all local events with Ml>3.0, and all regional events with Ml>=3.5 identified by the SCSN real-time system. The distributed solution automatically creates links from all USGS Simpson Maps to a text e-mail summary solution, creates a .gif image of the solution, and updates the moment tensor database tables at the SCEDC. The solution can also be modified using an interactive web interface, and re-distributed. The SCSN Moment Tensor Real Time Solution is based on the method developed by Doug Dreger at UC Berkeley.

Searchable Scanned Waveforms Site
The Caltech Seismological Lab has made available 12,223 scanned images of pre-digital analog recordings of major earthquakes recorded in Southern California between 1962 and 1992 at http://www.data.scec.org/research/scans/. The SCEDC has developed a searchable web interface that allows users to search the available files, select multiple files for download, and then retrieve a zipped file containing the results. Scanned images of paper records for M>3.5 southern California earthquakes and several significant teleseisms are available for download via the SCEDC through this search tool.

The COSMOS Strong-Motion Virtual Data Center
Archuleta, Ralph (UCSB), Jamison Steidl (UCSB), and Melinda Squibb (UCSB)

The COSMOS Virtual Data Center (VDC) is an unrestricted web portal to strong-motion seismic data records of the United States and 14 contributing countries for use by the engineering and scientific communities. A full range of search methods, including map-based, parameter-entry, and earthquake- and station-based searches, enables the web user to quickly find records of interest, and a range of display and download options allows users to view data in multiple contexts, extract and download data parameters, and download data files in convenient formats. Although the portal provides the web user a consistent set of tools for discovery and retrieval, the data files continue to be acquired, processed, and managed by the data providers to ensure the currency and integrity of the data. The Consortium of Organizations for Strong-Motion Observation Systems (COSMOS) oversees the development of the VDC through a working group comprised of representatives from government agencies, engineering firms, and academic institutions. New developments include a more powerful and informative interactive map interface, configurable design-spectra overlays on response spectra plots, and enhanced download and conversion options.

As of August 2005, the VDC contains searchable metadata for 507 earthquakes, 3,074 stations, and 25,718 traces. In the last few years substantial data sets representing earthquakes with magnitude greater than 5.0 have been added from the Chi-Chi, Taiwan earthquake, all New Zealand records from 1966-1999, and an array on the Pacific coast of Mexico, as well as smaller but seismically important data sets from Central Asia, Turkey, Peru, and India, and legacy data from California. The VDC incorporates all data available from the USGS and CISN with magnitude greater than 5.0 in highly seismic areas and greater than 4.5 in areas of low seismicity. Recent data sets from these sources include the 2004 Parkfield, CA Mw 6.0, the 2005 northern California Mw 7.2, and the 2005 Dillon, MT Mw 5.6 earthquakes. We are currently in the process of adding new data for UNR’s Guerrero array. The VDC also incorporates all data from the K-NET and KiK-net Japanese networks with magnitude greater than 5.0, depth less than 100 km, and a PGA of at least 0.1 g.
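The inclusion criteria above amount to a simple per-record filter. A minimal sketch in Python, with hypothetical field names (`net`, `mag`, `depth_km`, `pga_g`, `low_seismicity`) standing in for whatever the VDC metadata actually uses:

```python
def include_record(net, mag, depth_km=None, pga_g=None, low_seismicity=False):
    """Return True if a strong-motion record meets the stated VDC criteria.

    Network labels and argument names are illustrative, not the VDC schema.
    """
    if net in ("K-NET", "KiK-net"):
        # Japanese network data: M > 5.0, depth < 100 km, PGA >= 0.1 g
        return (mag > 5.0
                and depth_km is not None and depth_km < 100
                and pga_g is not None and pga_g >= 0.1)
    # USGS/CISN data: M > 5.0 in highly seismic areas, > 4.5 elsewhere
    threshold = 4.5 if low_seismicity else 5.0
    return mag > threshold

print(include_record("CISN", 6.0))                            # True
print(include_record("K-NET", 5.5, depth_km=120, pga_g=0.15)) # False
```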

The VDC has been funded by the National Science Foundation, under the Civil and Mechanical Systems Division (CMS-0201264), and COSMOS. The core members of COSMOS, the U.S. Geological Survey, the California Geological Survey, the U.S. Army Corps of Engineers and the U.S. Bureau of Reclamation, as well as contributing members, make their data available for redistribution by the VDC. COSMOS members, including representatives of the core members and members of the professional engineering community, provide ongoing operations and development support through an advisory working group.

Interseismic Strain Accumulation Across the Puente Hills Thrust and the Mojave Segment of the San Andreas Fault
Argus, Donald F. (Jet Propulsion Laboratory)

Integrating GPS observations of SCIGN from 1994 to 2005, trilateration observations from 1971 to 1992, SCEC campaign observations from 1984 to 1992, VLBI data from 1979 to 2000, and SLR data from 1976 to 2000, we find the following:

SAN ANDREAS FAULT
The Mojave segment of the San Andreas fault is slipping at 20 ± 4 mm/yr beneath a locking depth of 15 ± 5 km [95% confidence limits]. The slip rate is significantly (if marginally) slower than the 30 ± 8 mm/yr consensus estimate from paleoseismology [Working Group on California Earthquake Probabilities, 1995]. The locking depth is consistent with the 13-18 km seismogenic depth inferred from the maximum depth of earthquakes. The slip rate and locking depth are, respectively, slower and shallower than the 34 mm/yr and 25 km found by Eberhart-Phillips [1990] and Savage and Lisowski [1998]. Our values fit trilateration line-length rates at distances of 5 to 50 km from the fault, whereas their values predict lines to lengthen or shorten more quickly than observed.
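The screw-dislocation model referred to here predicts a fault-parallel surface velocity profile v(x) = (s/π) arctan(x/D) for deep slip rate s and locking depth D (Savage and Burford, 1973). A minimal sketch using the values quoted above, not the authors' actual inversion code:

```python
import math

def interseismic_velocity(x_km, slip_rate=20.0, locking_depth=15.0):
    """Fault-parallel surface velocity (mm/yr) at distance x_km from an
    infinitely long strike-slip fault locked to depth `locking_depth` (km)
    and slipping at `slip_rate` (mm/yr) below it (screw dislocation in an
    elastic half-space)."""
    return (slip_rate / math.pi) * math.atan(x_km / locking_depth)

# Velocity difference across a +-50 km aperture straddling the fault:
# with s = 20 mm/yr and D = 15 km, stations this close to the fault
# record well under the full deep slip rate.
dv = interseismic_velocity(50.0) - interseismic_velocity(-50.0)
```

Fitting observed line-length rates with this profile is what trades off slip rate against locking depth in the inversion described above.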

PUENTE HILLS THRUST, METROPOLITAN LOS ANGELES
The observation that northern metropolitan Los Angeles is shortening from north to south at 4.5 mm/yr tightly constrains elastic edge-dislocation models of interseismic strain accumulation. Along a profile running NNE across downtown Los Angeles, the position at which a north-dipping thrust fault stops being locked and begins creeping must be 8 ± 8 km north of downtown and 6 ± 2 km deep, and the horizontal component of deep slip must be 9 ± 2 mm/yr [95% confidence limits]. This suggests that the shallow segments of the [upper] Elysian Park thrust and the [Los Angeles segment of the] Puente Hills thrust are locked in place and will slip in a future earthquake. The 6 km locking depth we find is inconsistent with the 15 km seismogenic depth inferred from the maximum depth of earthquakes. Differences between the rheology of the sedimentary basin and the crystalline basement must next be taken into account in models of interseismic strain accumulation. The observations nevertheless suggest that a thrust beneath northern metropolitan Los Angeles is accumulating strain quickly, not a thrust at the southern front of the San Gabriel Mountains.


ANGULAR VELOCITIES OF SOUTHERN CALIFORNIA MICROPLATES
We present angular velocities describing the relative motion among the North American plate, the Pacific plate, the Sierra Nevada-Great Valley, the west Mojave Desert, the San Gabriel Mountains, and the Santa Monica Mountains. The four entities inside the Pacific-North America plate boundary zone are assumed to be elastic microplates; that is, they are assumed to deform only elastically in response to locking of the San Andreas and San Jacinto faults, and the predictions of screw-dislocation models of interseismic strain accumulation are subtracted from the observations before the angular velocities are estimated. The angular velocities yield predictions against which paleoseismic observations of fault slip and earthquake movements in the belts separating the microplates can be compared.

Inverse Analysis of Weak and Strong Motion Downhole Array Data: Theory and Applications

Assimaki, Dominic (GATech) and Jamison Steidl (UC Santa Barbara)

Current state-of-practice site response methodologies primarily rely on geotechnical and geophysical investigation for the necessary impedance information, whereas attenuation, a mechanism of energy dissipation and redistribution, is typically approximated by means of empirical correlations. For nonlinear site response analyses, the cyclic stiffness degradation and energy dissipation are usually based on published data. The scarcity of geotechnical information, the error propagation of measurement techniques, and the limited resolution of the continuum usually result in predictions of surface ground motion that compare poorly with low-amplitude observations, a discrepancy even further aggravated for strong ground motion.

Site seismic response records may be a valuable complement to geophysical and geotechnical investigation procedures, providing information on the true material behavior and site response over a wide range of loading conditions. We here present a downhole seismogram inversion algorithm for the estimation of low-strain dynamic soil properties. Comprising a genetic algorithm in the wavelet domain, complemented by a local least-squares fit operator in the frequency domain, the hybrid scheme can efficiently identify the vicinity of the optimal solution in the stochastic search space and provide robust estimates of the low-strain impedance and attenuation structures, which can successively be used for the evaluation of approximate nonlinear site response methodologies. Results are illustrated for selected aftershocks and the mainshock of the Mw 7.0 Sanriku-Minami earthquake in Japan.

Inversion of low-amplitude waveforms is first employed for the estimation of low-strain dynamic soil properties at five stations. Successively, the frequency-dependent equivalent linear algorithm is used to predict the mainshock site response at these stations by subjecting the best-fit elastic profiles to the downhole recorded strong motion. Finally, inversion of the mainshock empirical site response is employed to extract the equivalent linear dynamic soil properties at the same locations. The inversion algorithm is shown to provide robust estimates of the linear and equivalent linear impedance profiles, while the attenuation structures are strongly affected by scattering effects in the near-surface heterogeneous layers. The forward and inversely estimated equivalent linear shear-wave velocity structures are found to be in very good agreement, illustrating that inversion of strong-motion site response data may be used for the approximate assessment of nonlinear effects experienced by soil formations during strong-motion events.


Patterns of Crustal Coseismic Strain Release Associated with Different Earthquake Sizes as Imaged by a Tensor Summation Method

Bailey, Iain (USC), Thorsten Becker (USC), and Yehuda Ben-Zion (USC)

We use a method of summing potency tensors to study the temporal and spatial patterns of coseismic strain release in southern California. Tensors are calculated for individual earthquakes from catalog data, and we specifically target small events. By focusing directly on the smaller events and not performing any inversions, we are able to analyze a large data set, while minimizing assumptions that may affect the results and obscure finer details. We can then examine the case for or against the existence of fixed finite length scales related to the strain release patterns. In this study, we focus on the effects of earthquake and binning sizes on the results of the summing process. A summation that takes into account the scalar potency of each event will lead to a dominance of large events on the final results, while neglecting this information will be less correct physically and lead to a dominance of the more numerous smaller events. Concentrating on five spatially defined “bins”, chosen to contain sufficient data and to sample a range of tectonic regions in southern California (e.g., the Eastern California Shear Zone and the San Jacinto fault zone), we observe how the summation process can be affected by constraints imposed with regard to the size of events. We show how the results can fluctuate as a function of (i) the number of events summed, (ii) the region of spatial averaging (i.e., bin size), (iii) restricting the upper and lower magnitude for events summed, and (iv) whether or not we weight the events by their scalar potency. A similar degree of heterogeneity is observed at all scales, concurring with previous studies and implying scale-invariant behavior. However, we cannot yet conclude that this is indeed the case without further study of errors in the catalog data and possible artifacts associated with the analysis procedure.
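The weighted-versus-unweighted contrast at the heart of the summation can be sketched as follows. This is a minimal illustration, not the authors' code; the tensor components and bin handling are assumed:

```python
import numpy as np

def summed_source_tensor(tensors, potencies, weight_by_potency=True):
    """Sum potency tensors for the events in one spatial bin.

    tensors   : iterable of 3x3 symmetric tensors (mechanism of each event)
    potencies : scalar potencies P0; used as weights when weight_by_potency
                is True (large events dominate the sum), ignored otherwise
                (the numerous small events dominate).
    Returns the normalized summed tensor, whose consistency across bins
    reflects the homogeneity of strain release.
    """
    total = np.zeros((3, 3))
    for t, p0 in zip(tensors, potencies):
        t = np.asarray(t, dtype=float)
        unit = t / np.linalg.norm(t)   # keep only the mechanism direction
        total += (p0 * unit) if weight_by_potency else unit
    return total / np.linalg.norm(total)
```

Comparing the two modes on the same bin is one way to see whether a few large events or the background of small ones controls the apparent strain-release pattern.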

Historical Earthquakes in Eastern Southern California
Bakun, William (USGS)

Richter (1958) listed the Mojave Desert region as one of several structural provinces of California but concluded that it is almost non-seismic. The 1992 M7.3 Landers and 1999 M7.1 Hector Mine earthquakes showed how large earthquakes can link and rupture multiple short faults in the Eastern California Shear Zone (ECSZ). Historical seismicity should be reexamined knowing that large earthquakes can occur in the ECSZ. Here I discuss large earthquakes that occurred in southern California on 9 February 1890 and 28 May 1892 that are usually assumed to have occurred near the south end of the San Jacinto fault.

There is little control on the locations of these earthquakes. However, MMI intensity assignments for the Landers and Hector Mine earthquakes at common sites are comparable to those available for the 1890 event. Townley and Allen (1939) noted that “this shock was felt also at places along the railroad between Pomona and Yuma with intensities about the same as at Pomona, probably VII.” The MMI assignment at Pomona is VI. MMI is V or VI for the Landers and Hector Mine earthquakes at towns along the Southern Pacific railway between Pomona and Yuma. The MMI assignments are consistent with a location of the 1890 event in the ECSZ near the Landers and Hector Mine earthquakes. For an 1890 source located along the south end of the San Jacinto fault, an MMI of V at Pomona (rather than VI) and an MMI of VI-VII at San Diego (rather than V) would be expected. Although the uncertainty in MMI assignments is about one MMI unit, there are more discrepancies in the MMI assignments for a source near the south end of the San Jacinto fault than for one in the ECSZ.

If located on the Calico fault, where there is evidence of displacement 100 to a few thousand years ago, the intensity magnitudes are 7.2 for the 1890 and 6.6 for the 1892 earthquakes. Langenheim and Jachens (2002) used gravity and magnetic anomalies to identify a mafic crustal heterogeneity, named the Emerson Lake Body (ELB), which apparently affected strain distribution and slip in the 1992 Landers and 1999 Hector Mine earthquake sequences to the west and east of the Calico fault, respectively. The Calico fault, aligned along the east edge of the ELB, is situated where stress transferred from the Landers event would be concentrated. The failure of the Landers earthquake to trigger earthquakes on the Calico fault can be rationalized if large earthquakes occurred there in 1890 and 1892. These events suggest that the ECSZ has been seismically active since the end of the 19th century and that the earthquake catalog completeness level in the ECSZ is ~M6.5 at least until the early 20th century.
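The intensity-magnitude approach used here can be sketched as follows: for a trial source location, each MMI assignment implies a magnitude through an intensity attenuation relation, and the intensity magnitude Mi is their average over sites. The coefficients below are placeholders for illustration only, not the published Bakun-Wentworth values:

```python
import math

def implied_magnitude(mmi, dist_km, a=1.7, b=1.4, c=2.1):
    """Invert an attenuation relation MMI = a + b*M - c*log10(dist)
    for the magnitude implied by a single site's MMI assignment.
    a, b, c are illustrative placeholder coefficients."""
    return (mmi - a + c * math.log10(dist_km)) / b

def intensity_magnitude(observations):
    """Mi for a trial source location: the mean of the magnitudes
    implied by each (MMI, epicentral distance in km) observation."""
    mags = [implied_magnitude(mmi, d) for mmi, d in observations]
    return sum(mags) / len(mags)
```

A more distant trial source must be larger to produce the same felt intensities, which is how competing locations (south end of the San Jacinto fault versus the ECSZ) trade off against the historical MMI assignments.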

Some New and Old Laboratory Constraints on Earthquake Nucleation
Beeler, N. M. (USGS) and B. Kilgore (USGS)

A simple view of time-dependent earthquake nucleation from laboratory experiments is that there are minimum and characteristic nucleation patch sizes, controlled by the rate dependence of the fault surface and the asperity contact dimension. Direct observations of nucleation (Okubo and Dieterich, 1984; 1986), 1D elastic analysis with lab-based constitutive equations (Dieterich, 1986), and some plane-strain simulations support this simple view (Dieterich, 1992; Beeler, 2004). Based on extrapolations to natural stressing rates, laboratory-measured parameters suggest that while the duration of nucleation is very long (typically months or years), it would be impossible to resolve using surface and space-based strain sensors, and would be extremely difficult to detect in the subsurface using the most sophisticated borehole strain meters.

However, recent plane-strain simulations using rate and state constitutive relations [Rubin and Ampuero, in press, JGR, and unpublished] show that only for relatively large negative rate dependence is the nucleation patch size characteristic (stationary in time), and that different empirical lab-based evolution relations predict dramatically different behaviors. Rubin and Ampuero show that the key fault property controlling nucleation patch growth is the effective shear fracture energy, a parameter that unfortunately was not explicitly considered during the development of these relations. For fracture energy that increases with slip rate, under certain circumstances the nucleation patch size grows in time and produces detectable precursory slip.

In this study we review the experimental constraints on nucleation size, time dependent growth and the effective shear fracture energy from previous experiments conducted on a 2 m long fault, principally by Okubo and Dieterich (1984; 1986). We have conducted new experiments specifically for the purpose of studying nucleation, and constraining fracture energy. We have also developed new tests to better characterize slip and energy dissipation during nucleation. These tests can be used to develop better constitutive relations for nucleation and earthquake occurrence.

Discrete Element Modeling of Dynamic Rupture Interaction Between Parallel Strike-Slip Faults

Benesh, Nathan (Harvard), James Rice (Harvard), and John Shaw (Harvard)

The study of rupture propagation and initiation on parallel faults or fault segments by dynamic stress transfer is of great interest to the earthquake community. Small to moderate earthquakes can quickly become large, damaging earthquakes if rupture successfully steps from one fault segment to other adjacent, but not necessarily connected, fault segments. The 1992 Landers earthquake sequence and recent modeling and damage assessments of hypothetical Puente Hills fault ruptures illustrate the importance of understanding this interaction.

We adapted the Particle Flow Code in 2 Dimensions (PFC2D), a commercial discrete element code distributed by the Itasca Consulting Group and based on the DEM as developed by Cundall [1971], to dynamically model rupture propagation along two non-coplanar fault segments in a setup similar to that of Harris and Day [1993]. Harris and Day approached the problem with a finite difference code that invoked a slip-weakening friction law. Others have examined similar problems using boundary integral equation formulations (Fliss et al. [2005]), and we have also examined them with finite element methods (ABAQUS); however, an examination of fault-stepping ruptures has not yet been undertaken in a discrete element framework.

For the PFC2D analysis, we created a map-view area composed of tens of thousands of 2-dimensional disks and containing two straight, parallel faults. Zero-thickness walls were positioned among the discrete disks to create the faults, in order to make them asperity-free and promote possible slip. The slip-weakening failure model was implemented through use of the embedded FISH programming language to calculate slip of the individual particles along the faults and adjust individual friction coefficients accordingly.
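A linear slip-weakening law of the kind invoked here drops the friction coefficient from its static to its dynamic value over a critical slip distance. In PFC2D this per-particle update would live in FISH; Python stands in below, and the parameter values are illustrative rather than those used in the study:

```python
def slip_weakening_mu(slip, mu_s=0.6, mu_d=0.3, d_c=0.4):
    """Friction coefficient after a given amount of slip (same units as
    d_c): linear decay from the static value mu_s to the dynamic value
    mu_d over the critical slip distance d_c, constant at mu_d after."""
    if slip >= d_c:
        return mu_d
    return mu_s - (mu_s - mu_d) * slip / d_c

print(slip_weakening_mu(0.0))  # 0.6 (static friction, no slip yet)
print(slip_weakening_mu(0.2))  # 0.45 (halfway through the weakening)
print(slip_weakening_mu(1.0))  # 0.3 (fully weakened)
```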

Though this discrete element study still makes use of a constitutive slip-weakening friction/failure law, it provides a needed test of the appropriateness of the discrete element framework for this type of problem relative to the methods (FD, BIE, FE) mentioned previously. The successful application of the DEM in this study would simply be the first step, as the advantages of the DEM lie in its ability to go beyond a postulated constitutive law, assuming that the physics has been properly represented at the level of particle interactions. We hope to continue working with PFC2D to model earthquake ruptures governed not by a constitutive slip-weakening friction law but by more realistic fault-zone processes such as fluid pressurization and thermal weakening. We also aim to represent inelastic off-fault deformation, comparing predictions to FE results and to natural observations.

SCEC/UseIT: Smarter Navigation, Smarter Software
Beseda, Addie (University of Oregon)

The SCEC/UseIT intern program has engineered the SCEC-VDO software to meet its summer grand challenge of creating an earthquake monitoring system that allows scientists to quickly visualize important earthquake-related datasets and create movies which explain seismological details of the events. The software uses a 3D engine (based upon Java and the Java3D extension) to model data on regional and global scales. Summer 2005 UseIT interns have added and improved upon functionality for SCEC-VDO, leaving an end product of a powerful earthquake visualization tool.

My interest as an intern and programmer has been in making the software "smarter" so that users can take advantage of the program with a minimal learning curve. Companies such as Apple and Google are thriving on their mastery of user interfaces that are simple and straightforward, yet carry an amazing amount of power. My emphasis has been improving navigation in a 3D environment. I’ve created a "navigation-by-clicking" interface, in which a user can zoom and re-focus on a point by double-clicking on it. As simple as this task seems, it saves the user many of the additional mouse movements that would otherwise be needed to focus on the point. Making the software perform a simple, intuitive task requires a mastery of many fundamental concepts within Java3D and extensive brainwork to understand the details of camera position, interpolators, and other computer graphics “essentials”.

I've also undertaken a handful of small projects, all with the intention of making the SCEC-VDO software more user-friendly and intuitive.


The B4 Project: Scanning the San Andreas and San Jacinto Fault Zones
Bevis, Michael (OSU), Ken Hudnut (USGS), Ric Sanchez (USGS), Charles Toth (OSU), Dorota Grejner-Brzezinska (OSU), Eric Kendrick (OSU), Dana Caccamise (OSU), David Raleigh (OSU), Hao Zhou (OSU), Shan Shan (OSU), Wendy Shindle (USGS), Janet Harvey (UCLA), Adrian Borsa (UCSD), Francois Ayoub (Caltech), Bill Elliot (Volunteer), Ramesh Shrestha (NCALM), Bill Carter (NCALM), Mike Sartori (NCALM), David Phillips (UNAVCO), Fran Coloma (UNAVCO), Keith Stark (Stark Consulting), and the B4 Team

We performed a high-resolution topographic survey of the San Andreas and San Jacinto fault zones in southern California, in order to obtain pre-earthquake imagery necessary to determine near-field ground deformation after a future large event (hence the name B4), and to support tectonic and paleoseismic research. We imaged the faults in unprecedented detail using Airborne Laser Swath Mapping (ALSM) and all-digital navigational photogrammetry.

The scientific purpose of such spatially detailed imaging is to establish actual slip and afterslip heterogeneity so as to help resolve classic ‘great debates’ in earthquake source physics. We also expect to be able to characterize near-field deformation associated with the along-strike transition from continuously creeping to fully locked sections of the San Andreas fault with these data.

In order to ensure that the data are extraordinarily well georeferenced, an abnormally intensive array of GPS ground control was employed throughout the project. For calibration and validation purposes, numerous areas along the fault zones were blanketed with kinematic GPS profiles. For redundant determination of the airborne platform trajectory, the OSU independent inertial measurement unit and GPS system were included in the flight payload along with the NCALM equipment. Studies using the ground control are being conducted to estimate the true accuracy of the airborne data, and the redundant flight trajectory data are being used to study and correct for errors in the airborne data as well. All of this work is directed at overall improvement in airborne imaging capabilities, with the intent of refining procedures that may then be used in the large-scale GeoEarthScope project over the next few years, led by UNAVCO. More generally, we also intend to improve airborne imaging to the point of geodetic quality.

The present NSF-funded project, led by Ohio State University and the U.S. Geological Survey, was supported in all aspects of the airborne data acquisition and laser data processing by the National Center for Airborne Laser Mapping (NCALM), in continuous GPS station high-rate acquisition by SCIGN, and in GPS ground control by UNAVCO. A group of volunteers from USGS, UCSD, UCLA, Caltech, and private industry, as well as gracious landowners along the fault zones, also made the project possible. Optech contributed use of their latest scanner system, a model 5100, for the laser data acquisition along all of the faults scanned. The data set will be made openly available to all researchers as promptly as possible, but currently OSU and NCALM are still working on the data processing.

Supershear Slip Pulse and Off-Fault Damage
Bhat, Harsha (Harvard), Renata Dmowska (Harvard), and James R. Rice (Harvard)

We extend a model of a two-dimensional self-healing slip pulse, propagating dynamically in steady state with a slip-weakening failure criterion, to the supershear regime, in order to study the off-fault stressing induced by such a slip pulse and investigate features unique to the supershear range. Specifically, we show that there exists a non-attenuating stress field behind the Mach front which radiates high stresses arbitrarily far from the fault (practically this would be limited to distances comparable to the depth of the seismogenic zone), thus being capable of creating fresh damage or inducing Coulomb failure in known structures at large distances from the main fault. We use this particular feature to explain anomalous ground cracking several kilometers from the main fault during the 2001 Kokoxili (Kunlun) event in Tibet, in collaboration with Y. Klinger and G.C.P. King of IPGP Paris, for which it has been suggested that much of the rupture was supershear.

We allow for both strike-slip and dip-slip failure induced by such a slip pulse by evaluating Coulomb stress changes on both known and optimally oriented structures. In particular, we look for features of the supershear slip pulse that could nucleate a slip-partitioning event at places where reverse faults exist near a major strike-slip feature. Such a configuration exists in Southern California near the big bend segment of the San Andreas Fault, where there is active thrust faulting nearby. The most vulnerable locations would be those for which part of the presumably seismogenic thrust surface is within ~15-20 km of the SAF, which (considering dip directions) may include the Pleito, Wheeler Ridge, Cucamonga, Clearwater, Frazier Mountain, Alamo, Dry Creek, Arrowhead, Santa Ana, Waterman Canyon, and San Gorgonio faults, and reverse or minor right-reverse sections of the Banning and San Jacinto fault systems. Of course, many nearby strike-slip segments could be vulnerable to the distant stressing too, at least if not oriented too close to perpendicular or parallel to the SAF. The degree of vulnerability has a strong dependence, to be documented, on the directivity of the rupture on the SAF and the orientation of the considered fault segment.

We also compare the damage induced by a supershear slip pulse with that of its sub-Rayleigh analogue, to look for the unique signature left behind by such slip pulses in terms of off-fault damage. We show that off-fault damage is controlled by the speed of the slip pulse, the scaled stress drop, and the principal stress orientation of the pre-stress field. We also make some estimates of fracture energy which, for a given net slip and dynamic stress drop, is lower than for a sub-Rayleigh slip pulse, because part of the energy fed by the far-field stress is radiated back along the Mach fronts.

Rupture Scenario Development From Paleoseismic Earthquake Evidence
Biasi, Glenn (UN Reno), Ray Weldon (U. Oregon), and Tom Fumal (USGS)

We present progress in developing rupture scenarios for the most recent ~1400 years of the San Andreas fault based on published paleoseismic event evidence and fault displacement constraints. Scenarios presently employ records from eight sites from the Carrizo Plain to Indio and 46 total paleoquakes. The approach includes several novel aspects. First, the approach is consciously inclusive in regard to how known paleoquakes may or may not be part of multi-site ruptures. That is, the method recognizes that a rupture might consist of an individual paleoquake, or combine with evidence at one or more adjoining sites to form a multi-site rupture. Second, ruptures explicitly allow the case where there is no reported rupture at a neighboring site. If the center earthquake overlaps with the date distribution of a neighbor, that pairing is selected. If more than one overlap exists, both are recognized and carried forward. If there is no temporal overlap with a neighbor, the rupture is still allowed, with the interpretation that the evidence at the neighbor was not preserved or somehow not recognized. This strategy prevents missing evidence or incorrect dating of an earthquake at one site from trumping otherwise good evidence for correlation at sites on either side. Ruptures “missing” at several neighbor sites are, of course, less likely to be correct. Third, “missing” events are counted only if the investigation should have seen a rupture had one occurred. For example, the base of the reported Pitman Canyon record dates to ~900 AD, while the nearby Burro Flat and Wrightwood records cover over 300 years more. Ruptures older than ~900 AD are not penalized for lack of evidence at Pitman Canyon. Constructed in this manner, the pool contains several thousand unique ruptures. The exact number depends somewhat on rules for recognizing temporal overlap and the number of misses allowed before removing a rupture from the pool.

Scenarios consist of a selection of ruptures that together include all individual earthquakes from the record. This is done by selecting at random from the rupture list, removing other ruptures that also include any paleoquake in the chosen rupture, and selecting again. The pool of candidate ruptures shrinks until ultimately every paleoquake is chosen. Together the selected set of ruptures comprises one scenario for the San Andreas fault. An unlikely scenario could include 46 prehistoric ruptures, none of which correlate with a neighbor. Scenarios with a large number of short ruptures are less likely to account for the total slip. On the opposite end, in 100,000 scenarios we find a few cases that include all earthquakes in as few as 14 ruptures. This is similar to scenarios drawn by hand, where the preponderance of slip on the fault occurs during longer ruptures. We will present scoring mechanisms to assign probabilities to scenarios, including slip totals, displacement correlations where available, and other considerations. Scenarios with some chance of representing the actual 1400-year history of the San Andreas fault can then be evaluated for their hazard implications using the tools of OpenSHA.
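
The scenario-drawing loop described above lends itself to a compact sketch. This is an illustration only; the rupture pool below is a toy example, not the actual 46-paleoquake record:

```python
import random

def build_scenario(ruptures, rng=random.Random(0)):
    """Draw one scenario: a set of mutually exclusive ruptures that
    together cover every paleoquake. Each rupture is a frozenset of
    paleoquake identifiers."""
    pool = list(ruptures)
    scenario = []
    while pool:
        choice = rng.choice(pool)              # pick a rupture at random
        scenario.append(choice)
        # drop every remaining candidate that shares a paleoquake with it
        pool = [r for r in pool if not (r & choice)]
    return scenario

# Toy record: paleoquakes A-E observed at adjacent sites; candidates
# include single-site events and hypothetical multi-site correlations.
candidates = [frozenset(s) for s in
              [{"A"}, {"B"}, {"C"}, {"D"}, {"E"},
               {"A", "B"}, {"B", "C"}, {"D", "E"}]]
scenario = build_scenario(candidates)
covered = set().union(*scenario)               # every paleoquake is chosen
```

Because single-event ruptures remain in the pool until their paleoquake is covered, the loop always terminates with a complete, non-overlapping scenario.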

A Laboratory Investigation of Off-Fault Damage Effects on Rupture Velocity
Biegel, Ronald L., Charles G. Sammis (USC), and Ares J. Rosakis (Caltech)

Rice et al. (2005) formulated an analytical model for dynamic propagation of a slip-pulse on a fault plane. Using earthquake parameters analyzed by Heaton (1990), they found that stress concentration at the rupture front should produce granulation of fault rock to a distance of a few meters and wall rock fracture damage to 10s of meters. This off-fault damage contributes to the fracture energy and therefore affects rupture velocity; an effect not addressed by the Rice et al. model. Our challenge is to quantify this feedback for incorporation into the rupture model.

To this end we conducted 35 experiments using photoactive Homalite samples (Xie et al., 2004). We measured rupture velocities in samples having off-fault "damage elements" introduced in the form of small slits of different lengths that intersected the fault plane over a range of angles.

In most experiments, longer damage elements oriented at a low angle (30 degrees) to the fault plane decreased the rupture velocity in the area of the element but did not nucleate new damage. We attribute this transient decrease in rupture velocity to the effect of crack blunting from the presence of the slit. In these cases the rupture velocity increased for a short distance beyond the damage element until the rupture displacement matched that expected for linear propagation in the absence of the damage element. In those experiments with shorter slits oriented at higher angles to the fault plane (60 to 70 degrees), the damage element nucleated additional damage in the form of a tensile wing crack. In these cases, the higher fracture energy produced a permanent delay in rupture propagation.

Fault Length and Some Implications for Seismic Hazard
Black, Natanya M. (UCLA), David D. Jackson (UCLA), and Lalliana Mualchin (Caltrans)

We examine two questions: what is mx, the largest earthquake magnitude that can occur on a fault; and what is mp, the largest magnitude that should be expected during the planned lifetime of a particular structure. Most approaches to these questions rely on an estimate of the Maximum Credible Earthquake, obtained by regression (e.g. Wells and Coppersmith, 1994) of fault length (or area) and magnitude. Our work differs in two ways. First, we modify the traditional approach to measuring fault length, to allow for hidden fault complexity and multi-fault rupture. Second, we use a magnitude-frequency relationship to calculate the largest magnitude expected to occur within a given time interval.

Often fault length is poorly defined and multiple faults rupture together in a single event. Therefore, we need to expand the definition of a mapped fault length to obtain a more accurate estimate of the maximum magnitude. In previous work, we compared fault length vs. rupture length for post-1975 earthquakes in Southern California. In that study, we found that mapped fault length and rupture length are often unequal, and in several cases rupture broke beyond the previously mapped fault traces. To expand the geologic definition of fault length we outlined several guidelines: 1) if a fault truncates at young Quaternary alluvium, the fault line should be inferred underneath the younger sediments; 2) along-strike faults within 45˚ of one another should be treated as a continuous fault line; and 3) a step-over can link together faults at least 5 km apart.

These definitions were applied to fault lines in Southern California. For example, many of the along-strike fault lines in the Mojave Desert are treated as a single fault trending from the Pinto Mountain fault to the Garlock fault. In addition, the Rose Canyon and Newport-Inglewood faults are treated as a single fault line. We used these more generous fault lengths, and the Wells and Coppersmith regression, to estimate the maximum magnitude (mx) for the major faults in southern California. We will show a comparison of our mx values with those proposed by CALTRANS and those assumed in the 2002 USGS/CGS hazard model.
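
To illustrate how such regressions translate length into magnitude, the sketch below uses the Wells and Coppersmith (1994) all-slip-type surface-rupture-length regression, M = 5.08 + 1.16 log10(L). The fault lengths are hypothetical round numbers, not values from the study:

```python
import math

def wc_magnitude(length_km, a=5.08, b=1.16):
    """Moment magnitude from surface rupture length (km) via a
    Wells and Coppersmith (1994)-style regression, M = a + b*log10(L).
    Coefficients here are the commonly quoted all-slip-type values."""
    return a + b * math.log10(length_km)

# Illustrative: a single mapped 100 km fault vs. a 300 km linked system
m_single = wc_magnitude(100.0)   # ~7.4
m_linked = wc_magnitude(300.0)   # larger mx for the linked rupture
```

Linking fault segments into longer through-going ruptures, as the guidelines above do, raises mx by roughly half a magnitude unit per tripling of length under this regression.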

To calculate the planning magnitude mp we assumed a truncated Gutenberg-Richter magnitude distribution with parameters a, b, and mx. We fixed b and solved for the a-value in terms of mx, b, and the tectonic moment rate. For many faults, mp is relatively insensitive to mx, and it typically falls off at higher mx because the a-value decreases with increasing mx when the moment rate is constrained. Since we have an assumed magnitude-frequency distribution for each fault, we will sum these distributions and compare the result to the catalog.
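
The moment-rate balancing described above can be sketched numerically. This is a simplified illustration, not the authors' implementation: it assumes a truncated Gutenberg-Richter rate N(≥m) = 10^(a−bm), the standard relation M0 = 10^(1.5m+9.05) N·m, and purely hypothetical parameter values:

```python
import math

def moment_rate(a, b, m_min, m_max, dm=0.01):
    """Seismic moment rate (N*m/yr) implied by a truncated G-R law with
    incremental rate density n(m) = b*ln(10)*10**(a - b*m), m <= m_max."""
    total, m = 0.0, m_min
    while m < m_max:
        n = b * math.log(10) * 10 ** (a - b * m)   # events/yr per mag unit
        m0 = 10 ** (1.5 * m + 9.05)                # moment of a mag-m event
        total += n * m0 * dm
        m += dm
    return total

def solve_a(target_rate, b, m_min, m_max):
    """The moment rate scales as 10**a, so a can be solved directly."""
    base = moment_rate(0.0, b, m_min, m_max)
    return math.log10(target_rate / base)

def planning_magnitude(a, b, m_max, years):
    """Largest magnitude expected once in `years`: 10**(a-b*m)*years = 1."""
    return min((a + math.log10(years)) / b, m_max)

# Hypothetical fault: moment rate 1e18 N*m/yr, b = 1, magnitudes 5 to mx = 8
b = 1.0
a = solve_a(1e18, b, 5.0, 8.0)
mp_50 = planning_magnitude(a, b, 8.0, 50.0)   # 50-year planning magnitude
```

Raising mx in this sketch lowers the solved a-value, which is why mp stays well below mx for modest planning intervals.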

PBO Nucleus: Support for an Integrated Existing Geodetic Network in the Western U.S.

Blume, Frederick (UNAVCO), Greg Anderson (UNAVCO), Nicole Feldl (UNAVCO), Jeff Freymueller (U. of Alaska), Tom Herring (MIT), Tim Melbourne (CWU), Mark Murray (UC Berkeley), Will Prescott (UNAVCO), Bob Smith (U. of Utah), and Brian Wernicke (Caltech)

Tectonic and earthquake research in the US has experienced a quiet revolution over the last decade precipitated by the recognition that slow-motion faulting events can both trigger and be triggered by regular earthquakes. Transient motion has now been found in essentially all tectonic environments, and the detection and analysis of such events is the first-order science target of the EarthScope Project. Because of this and a host of other fundamental tectonics questions that can be answered only with long-duration geodetic time series, the incipient 1400-station EarthScope Plate Boundary Observatory (PBO) network has been designed to leverage 432 existing continuous GPS stations whose measurements extend back over a decade. The irreplaceable recording history of these stations is accelerating EarthScope scientific return by providing the highest possible resolution. This resolution will be used to detect and understand transients, to determine the three-dimensional velocity field (particularly vertical motion), and to improve measurement precision by understanding the complex noise sources inherent in GPS.

The PBO Nucleus project supports the operation, maintenance and hardware upgrades of a subset of the six western U.S. geodetic networks until they are subsumed by PBO. Uninterrupted data flow from these stations will effectively double the time-series length of PBO over the expected life of EarthScope, and has created, for the first time, a single GPS-based geodetic network in the US. The other existing sites remain in operation under support from non-NSF sources (e.g. the USGS), and EarthScope continues to benefit from their continued operation. On the grounds of relevance to EarthScope science goals, geographic distribution and data quality,
209 of the 432 existing stations were selected as the nucleus upon which to build PBO. Conversion of these stations to a PBO-compatible mode of operation was begun under previous funding, and as a result data now flow directly to PBO archives and processing centers, while maintenance, operations, and meta-data requirements continue to be upgraded to PBO standards. At the end of this project all 209 stations will be fully incorporated into PBO, meeting all standards for new PBO construction, including data communications and land-use permits. Funds for operation of these stations have been included in planned budgets for PBO after the construction phase ends and PBO begins an operational phase in 2008.

The research community has only begun to understand the pervasive effects of transient creep, and its societal consequences remain largely unexplored. For example, one open question is whether slow faulting pervasively moderates earthquake nucleation. The existence of slow earthquakes will impact seismic hazard estimation, since these transients are now known to ‘absorb’ a significant component of total slip in some regions and trigger earthquakes in others. The data from these stations serve a much larger audience than just the few people who work to keep them operating. This project is now collecting the data that will be used by the next generation of solid-earth researchers for at least two decades. Educational modules are being developed by a team of researchers, educators, and curriculum development professionals, and are being disseminated through regional and national workshops. An interactive website brings the newest developments in tectonics research to K-16 classrooms.

Fault-Based Accelerating Moment Release Observations in Southern California

Bowman, Dave (CSUF) and Lia Martinez (SCEC/SURE Intern, Colorado School of Mines)

Many large earthquakes are preceded by a regional increase in seismic energy release. This phenomenon, called “accelerating moment release” (AMR), is due primarily to an increase in the number of intermediate-size events in a region surrounding the mainshock. Bowman and King (GRL, 2001) and King and Bowman (JGR, 2003) have described a technique for calculating an approximate geologically-constrained loading model that can be used to define regions of AMR before a large earthquake. While this method has been used to search for AMR before large earthquakes in many locations, most of these observations are “postdictions” in the sense that the time, location, and magnitude of the main event were known and used as parameters in determining the region of precursory activity. With sufficient knowledge of the regional tectonics, it should be possible to estimate the likelihood of earthquake rupture scenarios by searching for AMR related to stress accumulation on specific faults. Here we show a preliminary attempt to use AMR to forecast strike-slip earthquakes on specific faults in southern California. We observe significant AMR associated with scenario events along the "Big Bend" section of the San Andreas fault, suggesting that this section of the fault is in the final stages of its loading cycle. Earthquake scenarios on the San Jacinto fault do not show significant AMR, with the exception of the "Anza Gap". No significant AMR is found associated with the Elsinore fault.

A New Community 3D Seismic Velocity Model for the San Francisco Bay Area: USGS Bay Area Velocity Model 05.0.0

Brocher, Thomas M., Robert C. Jachens, Russell W. Graymer, Carl M. Wentworth, Brad Aagaard, and Robert W. Simpson (USGS)

We present a new regional 3D seismic velocity model for the greater San Francisco Bay Area for use in strong motion simulations of the 1906 San Francisco and other Bay Area earthquakes. A detailed description of the model, USGS Bay Area Velocity Model 05.0.0, is available online [http://www.sf06simulation.org/]. The model includes compressional-wave velocity (Vp), shear-wave velocity (Vs), density, and intrinsic attenuation (Qp, Qs). The model dimensions are 290 km (along the coast) x 140 km (perpendicular to the coast) x 31 km (in depth).

As with the 1997 USGS Bay Area Velocity Model, the new model was first constructed as a 3D structural and geologic model with unit boundaries defined by crustal faults, differences in rock type, and stratigraphic boundaries. Fault geometries are based on double difference relocations of the seismicity, geologic mapping, and geophysical modeling of seismic and potential field data. Basin geometries are largely based on the inversion of gravity data. The model incorporates topography, bathymetry, and seawater. One of the advantages of this model, over smoothed models derived from seismic tomography alone, is that it offers sharper, and more realistic, interfaces and structural boundaries for wave propagation studies. The sharpness of boundaries is limited only by the discretization specified by the user.

To populate the structural model with physical properties, Vp versus depth curves were developed for each of the rock types in the Bay Area. These curves were developed from compilations of wireline borehole logs, vertical seismic profiles (conducted using surface sources and downhole receivers), density models, laboratory or field measurements on hand samples, and in situ estimates from seismic tomography and refraction studies. We developed empirical curves relating Vp and Vs by compiling measurements of these properties for a wide variety of common rock types under a variety of conditions (Brocher, 2005). To calculate density from Vp, we use Gardner’s rule for unmetamorphosed sedimentary units and equations proposed by Christensen and Mooney (1995), modified slightly in the upper 2.5 to 3 km, for other units. We use Qs versus Vs and Qs versus Qp relations developed for the Los Angeles basin by Olsen et al. (2003).
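
Two of the empirical relations mentioned above are commonly quoted in the forms sketched below. This is an illustration, not the model's exact implementation, which applies unit-specific and near-surface modifications:

```python
def gardner_density(vp_km_s):
    """Gardner's rule: density (g/cm^3) from Vp (km/s) for
    unmetamorphosed sedimentary rocks, rho ~= 1.74 * Vp**0.25."""
    return 1.74 * vp_km_s ** 0.25

def brocher_vs(vp_km_s):
    """Brocher (2005) 'Vs from Vp' regression fit (km/s), commonly
    quoted as valid for roughly 1.5 < Vp < 8.5 km/s."""
    vp = vp_km_s
    return (0.7858 - 1.2344 * vp + 0.7949 * vp ** 2
            - 0.1238 * vp ** 3 + 0.0064 * vp ** 4)

rho = gardner_density(3.0)   # density of a Vp = 3 km/s sediment
vs = brocher_vs(6.0)         # Vs for a typical crystalline-crust Vp
```

For Vp = 6 km/s the regression returns a Vs near 3.5 km/s, i.e. a Vp/Vs ratio close to the familiar crustal value of ~1.7.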

The model is distributed in a discretized form with routines to query the model using C++, C, and Fortran 77 programming languages. To minimize aliasing, the geologic model was discretized at higher resolution near the surface (maximum of 100m horizontal and 25m vertical) compared with depth (minimum of 800m horizontal and 200m vertical). The model contains material properties at nearly 190 million locations and is stored as an Etree database (Tu et al., 2003). The query routines provide a simple interface to the database, returning the material properties for a given latitude, longitude, and elevation. The surfaces, including faults, stratigraphic boundaries, and other interfaces, used in the model are also available.

The new model should have a number of seismic and geodetic applications. Our hope is that feedback from users will eventually help us refine the model and point to regions and volumes where problems and inconsistencies exist.

Constraints on Extreme Ground Motions from Unfractured Precipitous Sandstone Cliffs along the San Andreas Fault

Brune, James N. (UNR)

Probabilistic seismic hazard analysis (PSHA) is based on statistical assumptions that are very questionable when extrapolated to very low probabilities, giving ground motions of the order of 10 g acceleration and 5 m/s velocity at 10^-8 annual probability. The short historical database of instrumental recordings is not sufficient to constrain the extrapolations. This suggests that we look for geomorphic and geologic evidence constraining ground motions over long periods in the past.

Similar high ground motions from NTS nuclear explosions (ground accelerations of several g, and ground velocities of a few meters/sec) created spectacular mega-breccia rock avalanches in welded tuffs. Cliff faces were shattered and very large blocks of rock, several meters in dimensions, were moved horizontally and thrown downhill to form very impressive mega-breccia rubble piles.

Large sandstone outcrops occur at several locations along the San Andreas fault between Tejon Pass and Cajon Pass. These sandstones are as old as or older than the San Andreas fault and thus have been exposed to San Andreas earthquakes for about 5 million years. At the current inferred rate of occurrence of large earthquakes this might translate into about 20,000 M~8 events, with about 200 occurring in the last 50 ka, enough to provide statistical constraints at very low probabilities. Preliminary measurements of tensile strength of surface samples of the San Andreas sandstones indicate values of less than 10 bars. If these values correspond to the true tensile strength of rocks in bulk at depth over the history of the rocks, they provide constraints on very rare ground motions. Internally, if the particle velocities exceeded about 1 m/s at about 1/4-wavelength depth, the internal strains would fracture the rocks in tension. There is no evidence of such tension fracturing.

At several sites there are near-vertical sandstone cliffs several tens of meters high. Since the sandstones are considerably weaker than the welded tuffs at NTS, we would certainly expect mega-breccia rock avalanches of the type observed at NTS if similar extreme ground motions were to occur. There is no evidence of such avalanches. Although the precise erosion rate of the sandstones is not known, probably no such avalanches have been created in the last 10 ka to 100 ka, if ever. The suggested upper limits on ground motion are consistent with the current rock instrumental strong motion data set (covering only about 50 yrs), and suggest that the large NTS-type ground motions have not occurred over the much longer history of the San Andreas Fault.

Response of Seismicity to Coulomb Stress Triggers and Shadows of the 1999 Mw = 7.6 Chi-Chi, Taiwan, Earthquake

Chan, Chung-Han (USGS and NCU), Kuo-Fong Ma (NCU), and Ross S. Stein (USGS)

The correlation between static Coulomb stress increases and aftershocks has thus far provided the strongest evidence that stress changes promote seismicity, a correlation that the Chi-Chi earthquake well exhibits. Several studies have deepened the argument by resolving stress change on aftershock focal mechanisms, which removes the assumption that the aftershocks are optimally oriented for failure. Here one compares the percentage of planes on which failure is promoted after the main shock relative to the percentage beforehand. For Chi-Chi we find a 28% increase for thrust and an 18% increase for strike-slip mechanisms, commensurate with increases reported for other large main shocks. However, perhaps the chief criticism of static stress triggering is the difficulty in observing predicted seismicity rate decreases in the stress shadows, or sites of Coulomb stress decrease. Detection of sustained drops in seismicity rate demands a long catalog with a low magnitude of completeness and a high seismicity rate, conditions that are met at Chi-Chi. We find four lobes with statistically significant seismicity rate declines of 40–90% for 50 months, and they coincide with the
stress shadows calculated for strike-slip faults, the dominant faulting mechanism. The rate drops are evident in uniform cell calculations, 100-month time series, and by visual inspection of the M>3 seismicity. An additional reason why detection of such declines has proven so rare emerges from this study: there is a widespread increase in seismicity rate during the first 3 months after Chi-Chi, and perhaps many other main shocks, that might be associated with a different mechanism.
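
The quantity underlying both the trigger lobes and the stress shadows discussed above is the Coulomb failure stress change, ΔCFF = Δτ + μ′Δσn, resolved on a receiver plane. A minimal sketch with purely illustrative numbers:

```python
def coulomb_stress_change(d_shear, d_normal, mu_eff=0.4):
    """Coulomb failure stress change (MPa) on a receiver plane.
    d_shear  : shear stress change in the slip direction (MPa)
    d_normal : normal stress change, unclamping taken as positive (MPa)
    mu_eff   : effective friction coefficient (assumed value)."""
    return d_shear + mu_eff * d_normal

# Illustrative receiver plane: loaded in shear (+0.5 MPa) but
# clamped (-0.3 MPa normal change)
dcff = coulomb_stress_change(0.5, -0.3)   # net change in MPa
promoted = dcff > 0                        # positive -> failure promoted
```

Planes with ΔCFF > 0 fall in the trigger lobes where aftershock rates rise; ΔCFF < 0 defines the stress shadows where the sustained rate declines are observed.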

SCEC/UseIT: Integrating Graphics
Chang, Diana (University of Pennsylvania)

The SCEC/UseIT program gives a team of undergraduates a collaborative research environment in which to enhance a 3D visualization platform for research and media use. SCEC-VDO (SCEC Virtual Display of Objects) can display earthquake-related objects and record animated movies. Our goal was to build upon this software by developing its earthquake monitoring capabilities. Taking a graphics approach, my contributions included extending the navigation controls, restructuring the scripting plugin, and implementing the Source Rupture Model plugin. The ability to switch between third-person and first-person controls simplifies user navigation and allows more precision. The movie-making process has also been improved through modified camera path configuration, rebuilt camera speed adjustment, and a structural reorganization that allows real-time playback and smooth camera transitions between key-frame selection and user navigation. Finally, we implemented a new interactive plugin that displays a database of finite source rupture models, allowing visualization of the rupture process of individual earthquakes.

Resolving Fault Plane Ambiguity Using 3D Synthetic Seismograms
Chen, Po (USC), Li Zhao (USC), and Thomas H. Jordan (USC)

We present an automated procedure to invert waveform data for the centroid moment tensor (CMT) and the finite moment tensor (FMT) using 3D synthetic seismograms. The FMT extends the CMT to include the characteristic space-time dimensions, orientation of the source, and source directivity (Chen et al., BSSA, 95, 1170, 2005). Our approach is based on the use of receiver-side strain Green tensors (RSGTs) and seismic reciprocity (Zhao et al., this meeting). We have constructed a RSGT database for 64 broadband stations in the Los Angeles region using the SCEC CVM and K. Olsen’s finite-difference code. 3D synthetic seismograms can be easily computed by retrieving RSGTs on a small source-centered grid and applying the reciprocity principle. At the same time, we calculate the higher-order gradients needed to invert waveform data for the CMT and FMT. We have applied this procedure to 40 small earthquakes (ML < 4.8) in the Los Angeles region. Our CMT solutions are generally consistent with the solutions determined by Hauksson’s (2000) first-motion method, although they often show significant differences and provide better fits to the waveforms. For most small events, the low-frequency data that can be recovered using 3D synthetics (< 1 Hz) are usually insufficient for precise determination of all FMT parameters. However, we show the data can be used to establish the probability that one of the CMT nodal planes is the fault plane. For 85% of the events, we resolved the fault plane ambiguity of our CMT solutions at 70% or higher probability. As discussed by Zhao et al. (this meeting), the RSGTs can also be used to compute Fréchet kernels for the inversion of the same waveform data to obtain improved 3D models of regional Earth structure. This unified methodology for waveform analysis and inversion is being implemented under Pathway 4 of the SCEC Community Modeling Environment (Maechling et al., this meeting).

Visualization of Large Scale Seismic Data
Chourasia, Amit (SDSC) and Steve M. Cutchin (SDSC)

Large-scale data visualization is a challenging task involving many issues of visualization technique, computation, storage, and workflow. We present the use of three visualization techniques for creating high quality visualizations of seismic simulation data. While interactive techniques offer a good exploratory paradigm for visualization, the size and volume of these seismic simulation data make general interactive visualization tools impractical. Non-interactive techniques, utilizing both HPC and non-HPC resources, enable us to create dramatic and informative scientific visualizations of seismic simulations. The combination of three techniques - volume rendering, topography displacement, and 2D image overlaying - can help scientists quickly and intuitively understand the results of these simulations.

We present a working pipeline utilizing batch rendering on an HPC cluster combined with a standard PC using commodity and in-house software. Proposed additions to the pipeline, including co-located rendering and web-based monitoring, are also discussed.

SCEC/UseIT: Adding New Functionality to SCEC-VDO
Coddington, Amy (Macalester College)

This summer the SCEC/UseIT intern program built an earthquake monitoring system upon SCEC-VDO, the 3D visualization software developed by last summer's interns to show earthquakes and faults in Southern California and throughout the world. I added three new plug-ins to the program: a highways plug-in, an Anza Gap plug-in, and a plug-in showing Martin Mai’s collection of slip rupture fault models. The highways plug-in previously showed all the highways together; I separated these so that the user can display them individually. Near Anza, California, there is a section of the San Jacinto fault where seismicity is very low in comparison to the rest of the fault. This seismically quiet zone is where scientists believe the next big earthquake could occur. The Anza Gap plug-in helps scientists visualize the gap in seismicity by drawing a 3D box around it, allowing them to see whether an earthquake has occurred in this zone. My final project was displaying a set of faults with Mai’s slip function models on them, so scientists are able to visualize where the most movement has occurred along any given fault.

Dynamics of an Oblique-Branched Fault System
Colella, Harmony (CSUF) and David Oglesby (UCR)

We use a dynamic 3-D finite element analysis to investigate rupture propagation and slip partitioning on a branched oblique fault system. Oblique slip on a dipping basal fault propagates onto vertical and dipping faults near the Earth’s surface. When the slip on the basal fault includes a normal component, the preferred rupture propagation is upward to the vertical surface fault. Conversely, a thrust component of slip on the basal fault results in preferred propagation upward to the dipping surface fault. We also find that oblique slip on the basal fault results in partitioned slip on the near-surface faults, with more strike-slip motion at the surface trace of the vertical fault, and more dip-slip motion at the surface trace of the dipping fault. This result is in agreement with the static predictions of Bowman et al. (2003). The results also indicate that the stress interactions that exist in geometrically complex fault systems can lead to complexity in rupture propagation, including a crucial dependence on the direction of slip.

Seismic Hazard Assessment from Validated CFM-Based BEM Models
Cooke, Michele (UMass), Scott Marshall (UMass), and Andrew Meigs (OSU)

We have developed CFM-based three-dimensional Boundary Element Method (BEM) models that can be used to simulate deformation on either geologic or interseismic time scales. Meigs, Cooke, Graham and Marshall (this meeting) present a validation of the CFM-based three-dimensional fault configuration in the Los Angeles basin. The present study supports that validation by showing that the geologic vertical uplift data set better delineates between the alternative fault systems than slip rates or slip directions on active faults. While variations in fault geometry do alter the slip rates, these effects are smaller than the uncertainty range of paleoseismic slip rate estimates. The horizontal interseismic velocities from the best-fitting BEM model are also compared to the geodetic results of Argus et al. (2005). Using the model that best fits the geologic vertical uplift pattern (Meigs and others, this meeting), we assess possible earthquake recurrence and magnitude. Shear stress drops of earthquakes are invariant to magnitude and range from 1-3 MPa. From the three-dimensional BEM model we calculate the total stress drop over 5000 years on faults in the Los Angeles basin and, assuming a stress drop of 1 or 3 MPa per earthquake, determine recurrence intervals for moderate/large events on these faults. The magnitude of those earthquakes can be determined from the fault area and slip per event. Average earthquake recurrence rates on individual faults in the best-fitting model are 1000-3000 years, with magnitude 6-7 events. The stressing rates produced by this model may be used in seismic forecasting models such as RELM.
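
The recurrence and magnitude arithmetic described above can be sketched as follows. The fault area, slip, and stress-drop values are hypothetical, chosen only to land in the quoted 1000-3000 yr, magnitude 6-7 range:

```python
import math

def recurrence_interval(total_stress_drop_mpa, years, drop_per_event_mpa):
    """Mean recurrence interval (yr) from the model's accumulated stress
    drop over an interval and an assumed per-event stress drop."""
    stressing_rate = total_stress_drop_mpa / years    # MPa/yr
    return drop_per_event_mpa / stressing_rate

def moment_magnitude(area_km2, slip_m, mu_gpa=30.0):
    """Mw from fault area and slip per event via M0 = mu*A*D (N*m),
    with an assumed crustal rigidity mu."""
    m0 = mu_gpa * 1e9 * area_km2 * 1e6 * slip_m
    return (2.0 / 3.0) * (math.log10(m0) - 9.05)

# Illustrative fault: 5 MPa accumulated over 5000 yr, 2 MPa per event,
# rupture area 600 km^2 with 1.5 m of slip per event
t_r = recurrence_interval(5.0, 5000.0, 2.0)   # 2000 yr between events
mw = moment_magnitude(600.0, 1.5)              # magnitude near 7
```

With magnitude-invariant 1-3 MPa stress drops, the recurrence interval scales inversely with the modeled stressing rate on each fault.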

A Geoinformatics Approach to LiDAR / ALSM Data Distribution, Interpolation, and Analysis

Crosby, Christopher J. (ASU), Jeffrey Conner (ASU), Efrat Frank (SDSC), J. Ramón Arrowsmith (ASU), Ashraf Memon (SDSC), Viswanath Nandigam (SDSC), Gilead Wurman (ASU), and Chaitan Baru (SDSC)

Distribution, interpolation and analysis of large LiDAR (Light Detection And Ranging, also known as ALSM, Airborne Laser Swath Mapping) datasets push the computational limits of typical data distribution and processing systems. The high point density of LiDAR datasets makes grid interpolation difficult for most geoscience users, who lack the computing and software resources necessary to handle these massive data volumes. We are using a geoinformatics approach to the distribution, interpolation and analysis of LiDAR data that capitalizes on cyberinfrastructure being developed as part of the GEON project (www.geongrid.org). Our approach utilizes a comprehensive workflow-based solution that begins with user-defined selection of a subset of raw data and ends with download and visualization of interpolated surfaces and derived products. The workflow environment allows us to modularize and generalize the procedure. It provides the freedom to easily plug in any applicable process, to utilize existing sub-workflows within an analysis, and to extend or modify the analysis using drag-and-drop functionality through a graphical user interface.

In this GEON-based workflow, the billions of points within a LiDAR point cloud are hosted in an IBM DB2 spatial database running on the DataStar terascale computer at the San Diego Supercomputer Center, a machine designed specifically for data-intensive computations. Data selection is performed via an ArcIMS-based interface that allows users to execute spatial and attribute subset queries on the larger dataset. The subset of data is then passed to a GRASS Open Source GIS-based web service, “lservice”, that handles interpolation to a grid and analysis of the data. Lservice was developed entirely within the open source domain and offers spline and inverse distance weighted (IDW) interpolation to a grid with user-defined resolution and parameters. We also compute geomorphic metrics such as slope, curvature, and aspect. Users may choose to download their results in ESRI or ASCII grid formats, as well as GeoTIFF. Additionally, our workflow feeds into GEON web services in development that will allow visualization of Lservice outputs either in a web browser window or in 3D through Fledermaus’ free viewer iView3D.
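For readers unfamiliar with IDW gridding, the interpolation offered by lservice can be sketched as follows. This standalone toy implementation is ours, not GEON or GRASS code, and brute-forces every grid node; production services use spatial indexing and search radii.

```python
# Illustrative inverse distance weighted (IDW) interpolation of scattered
# points onto a regular grid. Not GEON/lservice code; a minimal sketch.

import numpy as np

def idw_grid(x, y, z, xi, yi, power=2.0, eps=1e-12):
    """Interpolate scattered samples (x, y, z) onto grid nodes (xi, yi)."""
    zi = np.empty(xi.shape)
    for idx in np.ndindex(xi.shape):
        d = np.hypot(x - xi[idx], y - yi[idx])
        w = 1.0 / (d ** power + eps)   # eps guards against divide-by-zero
        zi[idx] = np.sum(w * z) / np.sum(w)
    return zi

# Four LiDAR-like ground returns interpolated onto a 3 x 3 grid
x = np.array([0.0, 1.0, 0.0, 1.0])
y = np.array([0.0, 0.0, 1.0, 1.0])
z = np.array([10.0, 12.0, 11.0, 13.0])
gx, gy = np.meshgrid(np.linspace(0, 1, 3), np.linspace(0, 1, 3))
grid = idw_grid(x, y, z, gx, gy)
```

Grid nodes coincident with a sample recover that sample's elevation, while the grid center, equidistant from all four points, receives their mean.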


This geoinformatics-based system will allow GEON to host LiDAR point cloud data for the greater geoscience community, including data collected by the National Center for Airborne Laser Mapping (NCALM). In addition, most of the functions within this workflow are not limited to LiDAR data and may be used for distributing, interpolating and visualizing any computationally intensive point dataset. By utilizing the computational infrastructure developed by GEON, this system can democratize LiDAR data access for the geoscience community.

SDSC's Strategic Applications Collaborations Program Helps SCEC Researchers With Terascale Earthquake Simulation

Cui, Yifeng (SDSC), Giridhar Chukkapalli (SDSC), Leesa Brieger (SDSC), and Amitava Majumdar (SDSC)

The large-scale TeraShake simulation stretched SDSC resources across the board with unique computational as well as data challenges. SDSC computational experts in the Scientific Computing Applications Group worked closely with Anelastic Wave Model (AWM) developer Kim Olsen and others to port the code to DataStar, SDSC's IBM Power4 platform, and to enhance the code to scale up to many hundreds of processors for a very large mesh requiring a large amount of memory. The integration work resolved parallel computing issues raised by the large simulation, including MPI and MPI I/O performance improvement and single-processor tuning and optimization. Special techniques were introduced that reduced the code's memory requirements, making possible the largest and most detailed earthquake simulation of the southern San Andreas Fault in California of its time. The significant effort of testing, code validation, and performance scaling analysis took 30,000 allocation hours on DataStar to prepare for the final production run. The SDSC computational staff helped perform the final production run, which used 240 processors for 4 days and produced 43 TB of data on the GPFS parallel file system of DataStar. This final run, made possible by the enhancements to the code, dealt with a mesh of size 3000 x 1500 x 400 with 1.8 billion points at 200 m resolution, ten times larger than any previous case.
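The quoted mesh figures can be checked with back-of-the-envelope arithmetic. The per-node variable count below is an assumption for illustration (the actual AWM state layout may differ), but the point count follows directly from the mesh dimensions.

```python
# Sanity check of the TeraShake mesh size quoted above. The nine
# variables per node (3 velocity + 6 stress components) are assumed
# for illustration, not taken from the AWM source.

nx, ny, nz = 3000, 1500, 400
points = nx * ny * nz                  # total mesh points

bytes_per_node = 9 * 4                 # nine 4-byte floats per node (assumed)
memory_tb = points * bytes_per_node / 1024**4

print(f"{points/1e9:.1f} billion points, ~{memory_tb:.3f} TB of in-core state")
```

The 1.8 billion points match the abstract; even this conservative state estimate (tens of GB in core) explains why hundreds of processors and memory-reduction techniques were needed.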

SDSC's computational collaboration effort was supported through the NSF-funded SDSC Strategic Applications Collaborations (SAC) and Strategic Community Collaborations (SCC) programs. The mission of the SAC/SCC programs is to enhance the effectiveness of computational science and engineering research conducted by academic users nationwide. The goal of the collaboration is to develop a synergy between the academic researchers and SDSC staff that accelerates the researchers' efforts by using SDSC resources most effectively and enables new science like TeraShake on relatively short timescales. Researchers are selected from diverse academic disciplines, including both traditional HPC application fields and new communities. TeraShake is a great example of this kind of collaboration. Through the enhanced code, with its increased scalability, performance, and portability, the TeraShake SAC/SCC work also provides lasting value. The optimized code is available to the earthquake community for future large-scale simulations.

In 2005, the SAC group expanded its support of the SCEC/CME community by enhancing TeraShake2 rupture dynamics features and porting the CyberShake code for seismic hazard analysis, as well as by providing Advanced Support for TeraGrid Applications (ASTA). The collaboration is leading to resource allocation grants for SCEC of a million CPU hours on NSF terascale facilities.


2004 Parkfield Kinematic Inversion Using Strong-Motion Data Corrected by Site Effects

Custodio, Susana (UCSB), Pengcheng Liu (UCSB), and Ralph J. Archuleta (UCSB)

The Parkfield section of the San Andreas Fault is one of the most well studied fault zones in the world. A vast network of geophysical instruments monitors the region permanently. Short-term surveys complement our knowledge of the structure of the fault zone. The 2004 Mw6.0 Parkfield earthquake was extensively recorded in the near-source region. We invert strong-motion data recorded within 32 km of the epicenter to find a kinematic slip model for this event. Because the Parkfield fault region is very heterogeneous (Thurber et al., 2003; Eberhart-Phillips and Michael, 1993; Unsworth and Bedrosian, 2004), we account for site effects.

The kinematic inversion is based on a global inversion method (Liu and Archuleta, 2004) where the fault is divided into several subfaults and the source parameters (slip amplitude, rake angle, rise time and rupture velocity) are computed at the nodes (corners) of each subfault. We invert data in the range 0.16-1.0 Hz. We use two different 1D layered models for the velocity structure, one for each side of the fault. The bilateral 1D velocity model is interpolated from the 3D velocity models of Thurber et al. (2003) and Eberhart-Phillips and Michael (1993).

One of the most interesting features of the 2004 Parkfield earthquake was the large peak ground velocities (PGV) recorded on both ends of the rupture area. We use data from the Coalinga earthquake to infer site effects on the Parkfield array. The stations most affected by resonances (enhancement of certain frequencies) and local amplifications (general amplification of ground motion at all frequencies) are close to the fault zone, often coincident with the large PGVs. The stations that most amplify ground motion below 1.0 Hz are FZ2 and FZ1, followed by FZ14, FZ10, FZ7, FZ6, GH1W and FZ3. Stations FZ14, FZ3, and CH1E, followed by FZ6, GH1W and FZ2, present the largest resonances. Of these, only FZ3 recorded a relatively low PGV during the Parkfield earthquake. On the other hand, only station FZ12 recorded an extremely high PGV while not being strongly affected by site effects.

After taking site effects into account, we obtain a slip model characterized by a maximum slip amplitude of about 65 cm, confined to a region directly below and to the SE of the hypocenter. A secondary region of large slip is located to the NW of the hypocenter, at a shallower depth (2-8 km). Little or no slip occurs below 10 km.

Reference Earthquake Digital Library -- Structure and Technology

Czeskis, Alexei (SCEC/SURE Intern, Purdue), Brad Aagaard (USGS Menlo Park), Jessica Murray (USGS Menlo Park), Anupama Venkataraman (Stanford), and Greg Beroza (Stanford)

Understanding earthquake processes involves integration of a wide variety of data and models from seismology, geodesy, and geology. While the geophysics community has established data centers to archive and distribute data in standard formats, earthquake models lack a similar facility. This often dramatically reduces the models' lifespan, because they lack sufficient documentation and file format information to remain useful. The objective of the Reference Earthquakes Digital Library is to create the infrastructure to preserve the models in a central facility and to extend their lifespan for use in other studies.


A critical part of the digital library infrastructure for reference earthquakes is the ability for the modeling community to submit and update models. Additionally, the metadata and model data files must be checked to ensure that they conform to the standards established for the digital library. I designed and implemented browser-based submission and update interfaces by extending the digital library developed by the San Diego Supercomputer Center. The interfaces use a combination of JavaScript, Perl, PHP, Java, and C to process and validate the submission information. I also adapted the data retrieval interfaces for Reference Earthquake use. These interfaces are part of a fully functional prototype Reference Earthquakes Digital Library for the 1992 Landers and 2004 Parkfield earthquakes. The digital library provides curation and ready access to archived models in standard formats with associated metadata in a common, permanent repository.
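As a hedged illustration of the conformance checking described above: the actual interfaces use JavaScript, Perl, PHP, Java, and C, and the field names and rules below are hypothetical, not the library's real schema. The sketch only shows the shape of a required-field and type check on a submitted metadata record.

```python
# Minimal sketch of metadata conformance checking for a model submission.
# Field names and types are hypothetical examples, not the actual schema.

REQUIRED = {"event_name": str, "magnitude": float, "model_format": str}

def validate_metadata(record):
    """Return a list of problems; an empty list means the record conforms."""
    errors = []
    for field, ftype in REQUIRED.items():
        if field not in record:
            errors.append(f"missing field: {field}")
        elif not isinstance(record[field], ftype):
            errors.append(f"bad type for {field}: expected {ftype.__name__}")
    return errors

ok = validate_metadata({"event_name": "Parkfield 2004", "magnitude": 6.0,
                        "model_format": "ascii"})         # conforming record
bad = validate_metadata({"event_name": "Landers 1992"})   # two fields missing
```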

Fault Interaction through Off-Fault Open Cracks Induced by 3D Dynamic Shear Rupture Propagation

Dalguer, Luis A. (SDSU) and Steven M. Day (SDSU)

We developed 3D dynamic rupture models of parallel strike-slip faults to study the interaction between step-over faults, considering off-fault open cracks induced by the dynamic stress generated during shear rupture. The off-fault open cracks produce inelastic deformation in the volume surrounding the faults. This off-fault dissipation mechanism results in distortion of the fault slip profiles and reduction of the shear rupture velocity due to energy loss during open crack propagation. However, these open cracks, which take place on the tensional side of the fault, also produce fault linkage and coalescence and promote nucleation and growth of shear rupture on neighboring step-over faults. The second fault may coalesce with a third step-over fault as well, and so on. The interaction between the step-over faults occurs when abrupt stopping of the shear rupture causes tensile crack propagation. In addition, the interaction of parallel faults results in a heterogeneous shear stress distribution on the neighboring faults, naturally forming barriers and asperity patches.

Analogies Related to the Public Understanding of Earthquake Science: Their Identification, Characterization, and Use in Educational Settings

de Groot, Robert (USC)

The use of analogies, similes, and metaphors is pervasive in communication. Robert Oppenheimer (1956) stated that analogy was an indispensable and inevitable tool for scientific progress. Analogy, simile, and metaphor have long been used in educational environments as a way to help people make connections between what is known and what is not known (Harrison & Treagust, 1994). This poster will explore the major themes and study areas for analogies, similes, and metaphors in earthquake science, along with the pros and cons associated with their use. In addition to defining each word and providing examples of each, the history, efficacy, and the blurriness of the boundaries between these three “tools of thought” (Oppenheimer, 1956; Sutton, 1993; Lawson, 1993; Dreistadt, 1969) will be discussed. Glynn et al. (1991) refer to analogies as double-edged swords: although they can be beneficial, they can also be detrimental. Challenges and problems associated with analogy use in science communication will be considered. The presenter will propose a framework for categorizing analogies and the creation of analogy catalogs that will aid in analogy selection and use in instructional materials and communication. Communicating concepts related to earthquake science in educational materials, in classroom instruction, and during media events is a challenging task for earth scientists. The difficulty of earthquake communication is often greatly amplified if the explanations are prepared and presented in the midst of a post-earthquake response. Information about earthquakes must be explained carefully, otherwise misinformation and misconceptions can run rampant. It is clear that more light should be shed on the tools of language employed to convey earthquake science concepts to the public. Conference attendees are encouraged to contribute their favorite earthquake science analogy to the “earthquake analogy collection box” adjacent to this poster.

Geological Signals for Preferred Propagation Direction of Earthquake Ruptures on Large Strike-Slip Faults in Southern CA

Dor, Ory (USC), Yehuda Ben-Zion (USC), Tom Rockwell (SDSU), and Jim Brune (UNR)

Theoretical studies of mode II ruptures on a material interface indicate that: 1) Ruptures tend to evolve to wrinkle-like pulses that propagate preferentially in the direction of motion of the slower velocity block. 2) More damage is expected on the side of the fault with higher seismic velocity, which for the preferred propagation direction is the side that is persistently in the tensional quadrant of the radiated seismic field. Here we present systematic in-situ tests of these theoretical predictions. Observations made along sections of the San Andreas Fault (SAF), San Jacinto Fault (SJF) and the Punchbowl Fault (PF) in southern California indicate that these faults have asymmetric structure, with more damage on their northeastern side. The structural asymmetry that we observe has manifestations in the gouge scale (cm to meters), in the fault-zone scale (meters to 10s of meters) and in the damage-zone scale (10s to 100s of meters).

In three exposures of the SJF near Anza, heavily sheared gouge northeast of the principal slip surface (PSS) has up to 91% higher fracture density compared to a massive, inactive gouge on the southwest. South of Anza, inversion of seismic trapped waves shows that most of the 100 m wide damage-zone is also on the northeast side. Tomographic studies indicate that the more damaged northeast side has higher seismic velocity. On the SAF at two sites near Littlerock, the gouge-scale damage is concentrated on the northeast side of the PSS. In Palmdale, a ~60 m wide bedrock fault-zone shows considerably more SAF-related damage northeast of the PSS. Mapping of pulverized basement rocks along a 140 km stretch of the SAF in the Mojave shows that pulverization on a 100 m scale is more common and more intense on the northeast side of the fault. Seismic imaging in this area indicates that the northeast side has higher seismic velocity. In the PF, highly fractured to pulverized sandstone on the northeast is juxtaposed against moderately damaged basement rocks to the southwest, as indicated at gouge and fault-zone scales at two sites 1.5 km apart.

The correlation between the sense of damage asymmetry and the velocity structure on the SAF and on the SJF is compatible with a northwestward preferred propagation direction of ruptures along these faults, as predicted by the theory of rupture along a material interface. As for the PF, the excess damage on the northeast side requires that the tensional quadrant of the radiated seismic field during paleoearthquakes was preferentially on the northeastern side, implying a northwestward preferred propagation direction. These inferences could apply to other large strike-slip faults, where geological studies of symmetry properties may be utilized to infer possible preferred propagation directions of earthquake ruptures. The observed correlation of the symmetry properties of fault-zone damage with the local velocity structure suggests that the operating mechanism during large earthquakes in bi-material systems is rupture along a material interface. This can have important implications for improved understanding of earthquake interaction and mechanics, and for refined estimates of seismic shaking hazard.

Multicycle Dynamics of Two Parallel Strike-Slip Faults with a Step-Over

Duan, Benchun (UCR) and David D. Oglesby (UCR)

We combine a 2D elastodynamic model for the coseismic process and a 2D viscoelastic model for the interseismic process to study the dynamics of two parallel offset strike-slip faults over multiple earthquake cycles. The elastodynamic model is simulated numerically by a new explicit finite element code. The viscoelastic model, with an analytical solution, approximates the stress loading and relaxation on faults during the interseismic period. We find that fault stresses become highly heterogeneous near the step-over region after multiple cycles. This heterogeneity in fault stresses can have significant effects on rupture initiation, event patterns, and the ability of rupture to jump the segment offset. The fault system tends to develop a steady state in which the fault stress and event patterns are stable. We find that rupture tends to initiate near the step-over. Depending on the nature of the step-over (compressional or dilational), its width, and the overlap length, in the steady state we typically see one of two patterns: 1) rupture alternates between the two faults in consecutive events, or 2) the two faults rupture together in all events. The heterogeneous stress pattern that develops over multiple earthquake cycles results in rupture being able to jump larger fault offsets than has been observed in studies with homogeneous initial stresses.

Observing Supershear Ruptures in the Far-Field

Dunham, Eric (Harvard)

The dynamics of the supershear transition suggests the coexistence of two slip pulses, one propagating at a supershear speed and the other propagating around the Rayleigh wave speed. Evidence for this comes from both seismic observation of the two pulses during the 2002 Denali Fault earthquake and from laboratory experiments. The far-field radiation from such a rupture process (containing an initial sub-Rayleigh segment followed by a segment in which the two slip pulses described above simultaneously exist) is examined. P and S waves from the sub-Rayleigh segment arrive in the order in which they are emitted from the fault, with the signal appropriately compressed or stretched to account for directivity effects. For the supershear segment, a far-field S-wave Mach cone is defined. Outside of the Mach cone, S waves arrive in the order in which they were emitted from the source, as for the sub-Rayleigh segment. Inside the Mach cone, S waves arrive in the opposite order, with the first arrival having been emitted from the end of the supershear segment. On the Mach cone, waves from each point of the supershear segment arrive simultaneously, and the directivity pattern reaches its maximum here, rather than in the forward direction.

Overlap of arrivals from the supershear and sub-Rayleigh segments, which will occur within the Mach cone, complicates matters. Depending on the superposition of the waveforms, the maximum amplitudes may return to the forward direction. Consequently, standard techniques to estimate rupture velocity (e.g., plots of amplitude vs. azimuth, or finding a best-fitting constant rupture velocity model) could interpret this as a sub-Rayleigh directivity pattern. A potential solution involves examining properties of the waves in the spectral domain. These ideas are illustrated by generating synthetic seismograms of direct S waves and surface waves from a model of the Denali Fault event.
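The Mach cone geometry discussed above follows from a simple relation: for a rupture speed v_r exceeding the shear wave speed beta, the S-wave Mach cone half-angle satisfies sin(theta) = beta / v_r. The small sketch below is ours, with generic velocity values rather than parameters from the Denali event.

```python
# Illustrative S-wave Mach cone half-angle for a supershear rupture,
# sin(theta) = beta / v_r. Values are generic, not from the Denali event.

import math

def mach_half_angle_deg(shear_speed, rupture_speed):
    """Half-angle of the S-wave Mach cone; defined only when v_r > beta."""
    if rupture_speed <= shear_speed:
        raise ValueError("no Mach cone for sub-shear rupture speeds")
    return math.degrees(math.asin(shear_speed / rupture_speed))

beta = 3.5                   # km/s, a typical crustal shear wave speed
v_r = math.sqrt(2) * beta    # Eshelby speed, sqrt(2) * beta
theta = mach_half_angle_deg(beta, v_r)   # 45 degrees at the Eshelby speed
```

Stations inside this cone see the reversed S-wave arrival order described above; stations on the cone receive simultaneous arrivals from the whole supershear segment, producing the amplitude maximum there.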

SORD: A New Rupture Dynamics Modeling Code

Ely, Geoffrey (UCSD), Bernard Minster (UCSD), and Steven Day (SDSU)

We report on our progress in validating our rupture dynamics modeling code, capable of dealing with nonplanar faults and surface topography.

The method uses a “mimetic” approach to model spontaneous rupture on a fault within a 3D isotropic anelastic solid, wherein the equations of motion are approximated with a second-order Support-Operator method on a logically rectangular mesh. Grid cells are not required to be parallelepipeds, so non-rectangular meshes can be supported to model complex regions. However, for areas of the mesh that are in fact rectangular, the code uses a streamlined version of the algorithm that takes advantage of the simplifications of the operators in such areas.


The fault itself is modeled using a double-node technique, and the rheology on the fault surface is modeled through a slip-weakening, frictional, internal boundary condition. The Support Operator Rupture Dynamics (SORD) code was prototyped in MATLAB, and all algorithms have been validated against known analytical solutions (e.g., Kostrov) or previously validated solutions. This validation effort is conducted in the context of the SCEC Dynamic Rupture model validation effort led by R. Archuleta and R. Harris. Absorbing boundaries at the model edges are handled using the perfectly matched layers (PML) method (Marcinkovich & Olsen, 2003). PML is shown to work extremely well on rectangular meshes. We show that our implementation is also effective on non-rectangular meshes, under the restriction that the boundary be planar.

SORD has now been ported to Fortran 95 for multi-processor execution, with parallelization implemented using MPI. This provides a modeling capability on large-scale platforms such as the SDSC DataStar machine, the various TeraGrid platforms, and the SCEC high-performance computing facility. We will report on progress in validating that version of the code.

SORD, including both the MATLAB prototype and the Fortran parallel version, is intended to be contributed to the Community Modeling Environment (CME).

SCEC/UseIT: Intuitive Exploration of Faults and Earthquakes

Evangelista, Edgar (USC)

Within the Southern California Earthquake Center (SCEC), the Undergraduate Studies in Earthquake Information Technology (UseIT) program brings together undergraduate students from various colleges to manage and expand the SCEC Virtual Display of Objects (SCEC-VDO). Building on last summer’s interns’ initial development of SCEC-VDO, this summer we added numerous features and serviced existing parts of the software. We created an earthquake monitoring system that visualizes and analyzes earthquakes and fault systems in 3D for earthquake scientists, emergency response teams, the media, and the general public. To make the package more intuitive, I worked on several projects including navigation, orientation, fault mapping, and the general design of the software.

Within navigation, I added two modes in which the user either moves through the world or controls the world in his/her hands, using the mouse and/or keyboard. Furthermore, I created a navigational axes system and a more “natural” lighting system to help users orient themselves within the scene. Additionally, I created features for the USGS ’96 faults to aid user visualization. Finally, customizable backgrounds and grids make the program easier to view. Through these additional features, users can interact with their research more intuitively.

The SCEC Community Modeling Environment Digital Library

Faerman, Marcio (SDSC), Reagan Moore (SDSC), Yuanfang Hu (SDSC), Yi Li (SDSC), Jing Zhu (SDSC), Jean Bernard Minster (UCSD), Steven Day (SDSU), Kim Bak Olsen (SDSU), Phil Maechling (USC), Amit Chourasia (SDSC), George Kremenek (SDSC), Sheau-Yen Chen (SDSC), Arcot Rajasekar (SDSC), Mike Wan (SDSC), and Antoine de Torcy (SDSC)

The SCEC Community Modeling Environment (SCEC/CME) collaboration generates a wide variety of data products derived from diverse earthquake simulations. The datasets are archived in the SCEC Community Digital Library, which is supported by the San Diego Supercomputer Center (SDSC) Storage Resource Broker (SRB), for access by the earthquake community. The digital library provides multiple access mechanisms needed by geophysicists and earthquake engineers.

Efforts conducted by the SCEC/CME collaboration include TeraShake, a set of large-scale earthquake simulations of ruptures on the southern portion of the San Andreas Fault. TeraShake has generated more than 50 TB of surface and volume output data. The data have been registered into the SCEC Community Digital Library. Derived data products of interest include the surface velocity magnitude, peak ground velocity, displacement vector field and spectra information. Data collections are annotated with simulation metadata to allow data discovery operations through metadata-based queries. High-resolution 3D visualization renderings and seismogram analysis tools have been used as part of the data analysis process.

Another collaboration effort between the Pacific Earthquake Engineering Research Center (PEER) and the SCEC project, called "3D Ground Motion Project for the Los Angeles Basin", has produced 60 earthquake simulation scenarios. The earthquake scenarios comprise 10 LA Basin fault models, each associated with 6 source models. The surface output data of these simulations are registered at the SCEC Digital Library supported by the SDSC Storage Resource Broker. The Digital Library will also hold the data produced by the recent CyberShake SCEC/CME project, which calculates probabilistic seismic hazard curves for several sites in the Los Angeles area using 3D ground motion simulations.

Seismologists and earthquake engineers can access both the TeraShake and the Los Angeles Basin collections using a Scenario-Oriented Interface developed by the SCEC/CME project. An interactive web application allows users to select an earthquake scenario from a graphical interface, choosing earthquake source models, and then use the WebSim seismogram plotting application and a metadata extraction tool. Users can click on a location and obtain seismogram plots from the synthetic data remotely archived at the digital library. The metadata extraction tool provides users with a pull-down menu of metadata categories describing the selected earthquake scenario. In the case of TeraShake, for example, users can interact with the full-resolution 1 TB surface data online, generated for each simulation scenario. Other interfaces integrated with the SCEC/CME Digital Library include the Data Discovery and Distribution System (DDDS), a metadata-driven discovery tool for synthetic seismograms developed at USC, and the Synthetic and Observed Seismogram Analysis (SOSA) application, developed at IRIS.

We are currently investigating the potential of integrating the SCEC data analysis services with the GEON GIS environments. SCEC users are interested in interactively selecting layers of map and geo-based data to be accessed from seismic-oriented applications. HDF5 (Hierarchical Data Format 5) will be used to describe binary file contents containing multi-dimensional objects. The SCEC Community Digital Library currently manages more than 100 Terabytes of data and over 2 million files.

References:
SCEC Community Library: http://www.sdsc.edu/SCEC
SCEC/CME: http://www.scec.org/cme
Storage Resource Broker: http://www.sdsc.edu/srb

Earthquake Nucleation on Bended Dip-Slip Faults

Fang, Zijun (UC Riverside) and Guanshui Xu (UC Riverside)

Many studies have indicated that fault geometry plays a significant role in the earthquake nucleation process. Our previous parametric study of earthquake nucleation on planar dip-slip faults with depth-dependent friction laws shows that fault geometry and friction play equally important roles in the location of earthquake nucleation and the time it takes for such nucleation to occur. It therefore seems reasonable to speculate that for non-planar dip-slip faults, which are common in nature, fault geometry may play a more dominant role in the earthquake nucleation process. Using a slip-strengthening and weakening friction law in a quasi-static model based on the variational boundary integral method, we have investigated the effect of the bend angle on the earthquake nucleation process on both bended thrust and normal dip-slip faults. Detailed results for the nucleation location and time as a function of the bend angle will be presented.

Validation of Community Fault Model Alternatives From Subsurface Maps of Structural Uplift

Fawcett, Della (Oregon State), Andrew Meigs (Oregon State), Michele Cooke (UMass, Amherst), and Scott Marshall (UMass, Amherst)

In tectonically active regions characterized by blind thrust faults, there is potential to produce large scale earthquakes. Folds in overriding rock units are the primary indicator of these faults, both in size and displacement. Competing models of three-dimensional fault topology, starting from the SCEC Community Fault Model (CFM), were tested for viability using numerical Boundary Element Method (BEM) models and patterns of rock uplift by folds along the Northern LA Basin shelf, from Santa Monica east to the Coyote Hills. Well data and cross sections were used to construct surfaces in the ArcInfo GIS of the depth to three marker beds. Structure contour maps of the Quaternary (1.8 Ma), Pico (2.9 Ma) and Repetto (4.95 Ma) units were created. Contouring issues revolve around the frame of reference used to constrain differential structural relief across the region. Artifacts are introduced when structural relief is based on fold form on individual cross sections. Using the central trough as a frame of reference for measuring the amount of relief regionally results in more robust accurate structure contours. These maps constrain location and orientation of structural highs, which are indicative of fault location at depth. Rock uplift rate maps were constructed from these data and compared with rates generated by BEM models of 3D fault topology (North-South contraction at 100 nanostrain/year). BEM models investigate the sensitivity of uplift patterns to 1) dip of blind thrust faults (e.g. Las Cienegas and Elysian Park), 2) presence of a low-angle (~20 degree) thrust ramp below 10 km depths and 3) regional extent of this low-angle ramp. Contours of misfit represent model uplift subtracted from measured uplift, which test model-data compatibility in terms of structural trend, spatial variation in rates and location of major structures (i.e. key near surface folds). 
Alternative models to the CFM in the region of downtown LA, in which the Los Angeles/Las Cienegas and Elysian Park blind thrust faults dip 60 degrees and sole into a regionally extensive low-angle ramp below 10 km depth, improve the compatibility between modeled and geologic uplift.

Weak Faults in a Strong Crust: Geodynamic Constraints on Fault Strength, Stress in the Crust, and the Vertical Distribution of Strength in the Lithosphere

Fay, Noah and Gene Humphreys (University of Oregon)

We present results of steady-state dynamic finite element numerical models for the state of stress and strain rate in the crust and upper mantle in the vicinity of a transform fault. Model rheology is elastic-viscous-plastic where plastic mechanical behavior is used as a proxy for pressure-dependent friction of the seismogenic crust. Viscous flow is incorporated as temperature dependent, power-law creep. We assume that the crust outside the fault zone is at or near its Byerlee’s law-predicted frictional yield strength (i.e., “strong”, e.g., Townend and Zoback, 2001) and aim to determine the acceptable range of fault strength and viscosity distributions that satisfy the observations that seismogenic faulting typically extends to 15 km depth and that the tectonic strain rate of fault-bounding blocks is small. Assuming the traditional “christmas-tree” strength distribution of the lithosphere (e.g., Brace and Kohlstedt, 1980), our primary results are the following: The upper limit of fault strength is approximately 20-25 MPa (avg. over 15 km). The majority (>50%) of the vertically integrated strength


of the lithosphere resides in the uppermost mantle. The depth to which the fault-bounding crust obeys Byerlee’s law depends on the strength of nearby faults and viscosity of the lower crust and should not exceed approximately 6-8 km, below which relatively low strain rate viscous creep is the dominant deformation mechanism.
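
The "christmas-tree" strength profile invoked above can be illustrated with a short sketch: Byerlee-type frictional strength grows roughly linearly with depth, power-law creep strength falls with temperature, and the strength at any depth is the lesser of the two. All parameter values below (friction coefficient, geotherm, creep constants) are illustrative assumptions, not values from the authors' finite element models.

```python
import math

def brittle_strength(z_km, friction=0.6, rho=2800.0, lam=0.37):
    """Byerlee-type frictional strength (Pa) at depth z_km, with pore
    pressure expressed through the assumed ratio lam."""
    g = 9.81
    sigma_n = rho * g * z_km * 1e3           # lithostatic normal stress, Pa
    return friction * sigma_n * (1.0 - lam)  # effective-stress reduction

def viscous_strength(z_km, strain_rate=1e-15, A=1e-26, n=3.0, Q=2.6e5,
                     T_surface=273.0, dTdz=20.0):
    """Power-law creep strength (Pa) at depth z_km for a fixed strain rate,
    assuming a linear geotherm of dTdz K/km."""
    R = 8.314
    T = T_surface + dTdz * z_km
    return (strain_rate / A) ** (1.0 / n) * math.exp(Q / (n * R * T))

def strength_profile(zmax_km=40.0, dz=0.5):
    """Depths and strengths: the minimum of the brittle and viscous branches."""
    zs, taus = [], []
    z = dz
    while z <= zmax_km:
        zs.append(z)
        taus.append(min(brittle_strength(z), viscous_strength(z)))
        z += dz
    return zs, taus

def integrated_strength(zs, taus):
    """Trapezoid-rule vertical integral of strength (N/m)."""
    return sum(0.5 * (taus[i] + taus[i - 1]) * (zs[i] - zs[i - 1]) * 1e3
               for i in range(1, len(zs)))
```

With these particular values the shallow lithosphere is friction-limited and the deep lithosphere creep-limited, so both the brittle-ductile crossover and the vertically integrated strength discussed above fall out of a few lines of code.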

The June 2005 Anza/Yucaipa Southern California Earthquake Sequence

Felzer, Karen (USGS)

On June 12, 2005, a M_W 5.2 earthquake occurred on the San Jacinto fault system near the town of Anza and was felt throughout Southern California. Two aspects of the subsequent activity appeared unusual. The first was an elongated cluster of aftershocks along the San Jacinto fault zone. The mainshock fault length was on the order of several km, as was the length of the most densely clustered part of the aftershock sequence, but a clear scattering of (mostly small) aftershocks also extended from 30 km north to at least 20 km south of the mainshock hypocenter. This raised early speculation that aftershocks were being triggered by a creep event along the San Jacinto. A creep event at depth has now been observed, with an estimated moment equivalent to that of an M 5.0 (Agnew and Wyatt, 2005). Whether this creep event is unusual, and thus whether it could have created an unusual aftershock sequence, cannot be determined, because a lack of instrumentation elsewhere has prevented similar observations. Aseismic afterslip at the surface is routinely observed, and aseismic slip below the surface on both the mainshock and surrounding faults was inferred after the Loma Prieta earthquake from GPS measurements (Segall et al. 2000).

An alternative explanation for the elongated sequence is that aftershock sequences are always this long -- we just can't usually tell, because we don't normally record at the completeness provided by the densely instrumented Anza seismic array. To test this hypothesis we plot the Anza aftershock sequence with different lower magnitude cutoffs; only when we include magnitudes below the normal completeness threshold does the sequence appear to be long. We also use the Felzer and Brodsky (2005) relationships to simulate what the aftershock sequence of an M_W 5.2 earthquake should look like with very small magnitudes included. The simulations agree well with observation if we restrict most of the aftershocks to the trend of the San Jacinto fault zone.
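
The simulation step can be sketched in miniature: draw aftershock magnitudes from a Gutenberg-Richter distribution and occurrence times from a modified Omori law by inverse-CDF sampling. This is a generic illustration with assumed parameters (b, c, p), not the Felzer and Brodsky (2005) relationships themselves, and it omits the spatial placement of events along the fault trend.

```python
import math
import random

def sample_gr_magnitude(m_min, b=1.0):
    """Draw one magnitude from a Gutenberg-Richter distribution above m_min."""
    u = random.random()
    return m_min - math.log10(1.0 - u) / b

def sample_omori_times(n, c=0.01, p=1.1, t_max=100.0):
    """Draw n aftershock times (days) from a modified Omori law with
    rate ~ (t + c)^-p on [0, t_max], via the inverse CDF (requires p != 1)."""
    a0 = c ** (1.0 - p)
    a1 = (t_max + c) ** (1.0 - p)
    times = []
    for _ in range(n):
        u = random.random()
        t = (a0 + u * (a1 - a0)) ** (1.0 / (1.0 - p)) - c
        times.append(max(t, 0.0))   # guard against float round-off at u ~ 0
    return sorted(times)

def simulate_sequence(n_events, m_min=0.0, b=1.0):
    """Times and magnitudes of a synthetic aftershock sequence."""
    times = sample_omori_times(n_events)
    mags = [sample_gr_magnitude(m_min, b) for _ in range(n_events)]
    return list(zip(times, mags))
```

Lowering the magnitude cutoff in such a simulation adds many small events at large times and distances, which is the effect the completeness argument above turns on.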

The other potentially unusual event was the occurrence of a M_W 4.9 earthquake 4 days after the Anza mainshock, 72 km away. There is less than a 2% probability that this earthquake occurred by random chance; over the past five years (2000-2005) there have been only 8 M > 4.9 earthquakes in Southern California. Is it plausible that a M_W 5.2 earthquake could have triggered another earthquake over 70 km away? Static stress changes are negligible at such distances, but dynamic stress changes are not. Using M 5-6 mainshocks from throughout California we demonstrate that triggering can occur out to several hundred kilometers at high statistical significance, corroborating statistics by Ebel and Kafka (2002). On average, at distances over 50 km, within a time period of 7 days, we expect an M 5.2 earthquake to trigger a M > 4 earthquake about 10% of the time and to trigger a M > 5 earthquake about 1% of the time.
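
The quoted ~2% figure is consistent with a simple stationary-Poisson background-rate argument built from the numbers above (8 M > 4.9 events in 5 years, a 4-day lag). The sketch below ignores any additional spatial conditioning the original calculation may have included.

```python
import math

def poisson_chance_probability(n_events, catalog_years, window_days):
    """Probability of at least one background event in the time window,
    assuming a stationary Poisson process at the long-term regional rate."""
    rate_per_day = n_events / (catalog_years * 365.25)
    return 1.0 - math.exp(-rate_per_day * window_days)

# 8 M > 4.9 events in 5 years; the M_W 4.9 event came 4 days after Anza.
p = poisson_chance_probability(8, 5.0, 4)
```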

A Closer Look at High Frequency Bursts Observed During the 1999 Chi-Chi, Taiwan Earthquake

Fischer, Adam (USC) and Charles Sammis (USC)

High frequency, band-pass filtered waveforms from the 1999 Mw = 7.6 Chi-Chi, Taiwan earthquake show a multitude of distinct, short duration energy bursts. Chen et al. (B.S.S.A., 2005, accepted for publication) assumed the sources of these bursts were slip patches on or near the Chelungpu fault


plane and devised a location algorithm that resolved about 500 unique events. Based on these locations and the relative sizes of the bursts they reached the following conclusions:

1) The earliest bursts occur directly up-dip from the hypocenter and appear to have been triggered by the P wave.
2) The first bursts at greater distances from the epicenter appear to be triggered by the propagating rupture front.
3) Later events at all distances follow Omori’s law if time is measured from the arrival of the rupture front at each distance.
4) The size distribution of the bursts is described by the Gutenberg-Richter distribution over a range of two magnitude units.
5) Most shallow events are small. All deep events are large.

In this study, we test the hypothesis that the high frequency bursts originate on the fault plane and are not path or site effects. For several events, the vector displacements were measured at a number of 3-component stations and compared with the predicted radiation pattern from a double-couple point source at the observed hypocenter in a homogeneous half-space. We also show that small events occurring at depth would be resolved by the array, and hence conclusion (5) above is not an artifact of attenuation. We present further evidence that the events are not site effects by showing they are not correlated with the amplitude of direct P and S. Because the times and magnitudes of the bursts appear to obey both the Gutenberg-Richter and Omori laws, we further investigate their relationship with the normal aftershock sequence following the mainshock.
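
The predicted double-couple radiation pattern rests on the standard far-field P-wave amplitude factor, proportional to γ·M·γ for ray direction γ and moment tensor M. A minimal sketch follows; the vertical strike-slip tensor is a hypothetical example, not one of the Chi-Chi burst mechanisms.

```python
import math

def p_radiation(moment_tensor, takeoff_deg, azimuth_deg):
    """Far-field P-wave amplitude factor gamma . M . gamma for a unit ray
    direction gamma, in the x=north, y=east, z=down convention."""
    th = math.radians(takeoff_deg)
    az = math.radians(azimuth_deg)
    g = (math.sin(th) * math.cos(az),
         math.sin(th) * math.sin(az),
         math.cos(th))
    return sum(g[i] * moment_tensor[i][j] * g[j]
               for i in range(3) for j in range(3))

# Hypothetical example: a vertical strike-slip double couple striking
# north (unit moment; only the Mxy components are nonzero).
M_ss = [[0.0, 1.0, 0.0],
        [1.0, 0.0, 0.0],
        [0.0, 0.0, 0.0]]
```

Comparing observed vector displacements against the quadrants of this pattern is the essence of such a test, though the actual source geometries and half-space corrections are not reproduced here.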

Direct Observation of Earthquake Rupture Propagation in the 2004 Parkfield, California, Earthquake

Fletcher, Jon (USGS), Paul Spudich (USGS), and Lawrence Baker (USGS)

Using a short-baseline seismic array (UPSAR) about 12 km west of the rupture initiation of the September 2004 M6 Parkfield, CA earthquake, we have observed the movement of the rupture front of this earthquake on the San Andreas Fault. Apparent velocity and back azimuth of seismic arrivals are used to map the locations of the sources of these arrivals. We infer that the rupture velocity was initially supershear to the northwest, then slowed near the town of Parkfield after 2 s. The observed direction of supershear propagation agrees with numerical predictions of rupture on bimaterial interfaces. The last well-correlated pulse, 4 s after S, is the largest at UPSAR, and its source is near the region of large accelerations recorded by strong motion accelerographs. The coincidence of sources with pre-shock and aftershock distributions suggests that fault material properties control rupture behavior. Small seismic arrays may be useful for rapid assessment of earthquake source extent.
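
Mapping arrivals from apparent velocity and back azimuth rests on a plane-wave fit across the array. A minimal least-squares version, assuming local Cartesian station coordinates in km (a sketch, not the UPSAR processing itself):

```python
import math

def solve3(A, b):
    """Solve a 3x3 linear system by Gaussian elimination with pivoting."""
    M = [row[:] + [bi] for row, bi in zip(A, b)]
    for col in range(3):
        piv = max(range(col, 3), key=lambda r: abs(M[r][col]))
        M[col], M[piv] = M[piv], M[col]
        for r in range(col + 1, 3):
            f = M[r][col] / M[col][col]
            for c in range(col, 4):
                M[r][c] -= f * M[col][c]
    x = [0.0, 0.0, 0.0]
    for r in (2, 1, 0):
        x[r] = (M[r][3] - sum(M[r][c] * x[c] for c in range(r + 1, 3))) / M[r][r]
    return x

def plane_wave_fit(coords_km, times_s):
    """Least-squares fit of t = t0 + sx*x + sy*y (x east, y north).
    Returns back azimuth (deg clockwise from north, pointing toward the
    source) and apparent velocity (km/s)."""
    GtG = [[0.0] * 3 for _ in range(3)]
    Gtd = [0.0] * 3
    for (x, y), t in zip(coords_km, times_s):
        row = (1.0, x, y)
        for i in range(3):
            Gtd[i] += row[i] * t
            for j in range(3):
                GtG[i][j] += row[i] * row[j]
    t0, sx, sy = solve3(GtG, Gtd)
    s = math.hypot(sx, sy)                    # horizontal slowness, s/km
    baz = math.degrees(math.atan2(-sx, -sy)) % 360.0
    return baz, 1.0 / s
```

Tracking how the fitted back azimuth and apparent velocity change through the wavetrain is what lets an array follow the rupture front in time.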

SCEC/UseIT: Integration of Earthquake Analysis into SCEC-VDO

Fonstad, Rachel (Winona State)

I was inspired by Lucy Jones's presentation about how scientists monitor earthquakes and use earthquake statistics, and I wanted to add the option of creating a magnitude-frequency plot to SCEC-VDO. I created a new plug-in that allows the user to choose and load a catalog of earthquakes and then plot magnitude versus log(frequency). The plot displays in a new window with a regression line and equation. This helps us to understand how much more common smaller magnitude earthquakes are than larger ones. We plot the logarithm of frequency rather than frequency itself because frequency decreases exponentially with magnitude and the trend is difficult to visualize without this adjustment.
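
The actual plug-in is written in Java inside SCEC-VDO, but the calculation it performs can be sketched in a few lines of Python: bin the catalog, take log10 of the counts (cumulative counts are used here, one common choice), and fit a regression line whose negative slope is the Gutenberg-Richter b-value.

```python
import math

def gr_fit(magnitudes, bin_width=0.5):
    """Fit log10(N) = a - b*M to cumulative counts N(>= M) per bin center
    by ordinary least squares. Returns (a, b)."""
    m_min, m_max = min(magnitudes), max(magnitudes)
    xs, ys = [], []
    m = m_min
    while m <= m_max:
        count = sum(1 for mag in magnitudes if mag >= m)
        if count > 0:
            xs.append(m)
            ys.append(math.log10(count))
        m += bin_width
    n = len(xs)
    mx = sum(xs) / n
    my = sum(ys) / n
    slope = (sum((x - mx) * (y - my) for x, y in zip(xs, ys))
             / sum((x - mx) ** 2 for x in xs))
    a = my - slope * mx
    return a, -slope   # b-value is the negative of the fitted slope
```

Real catalogs typically yield b near 1, i.e. a tenfold drop in frequency per unit of magnitude.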


This plug-in was put under a new menu titled “Analysis” as it is intended to be the first among other plug-ins that will help to analyze earthquakes. Now, for the first time, SCEC-VDO is more than just a visualization tool. Before the implementation of these new plug-ins, we could only opt to display groups of earthquakes on the map. Beyond my plug-in, there will eventually be more plug-ins to help users better understand earthquakes.

Determination of Slip Rates on the Death Valley-Furnace Creek Fault System: Towards an Understanding of the Spatial and Temporal Extent of Strain Transients -- A Progress Report

Frankel, Kurt L., James F. Dolan (USC), Robert C. Finkel (LLNL), Lewis A. Owen (Univ. of Cincinnati), and Jeffrey S. Hoeft (USC)

There has recently been great interest within both SCEC and the broader geodynamics community in the occurrence (or absence) of strain transients at a variety of spatial and temporal scales. Of particular interest are comparisons of geodetic and geologic rate data across the Mojave section of the eastern California shear zone (ECSZ), which suggest that the rapid geodetic rates measured in the region (~10-12 mm/yr) may be much faster than longer-term geologic rates. The possible strain transient revealed by these data contrasts markedly with rate data from the Big Bend section of the San Andreas fault (SAF) and the central Garlock fault, where geologic and geodetic data indicate that these structures are storing elastic strain energy at rates that are slower than their long-term fault slip rates. These comparisons of geologic and geodetic rate data raise a basic question: Are strain transients local features of only limited extent, perhaps tied to geometric complexity (e.g. the Big Bend section of the SAF)? Alternatively, are strain transients such as that in the Mojave more regionally extensive phenomena that characterize loading of large sections of the plate boundary? The answers to these questions have fundamental implications for our understanding of the geodynamics of fault loading, and hence, for the occurrence of earthquakes. We are using cosmogenic nuclide geochronology coupled with airborne laser swath mapping (ALSM) high-resolution digital topographic data (LiDAR) to generate slip rates from the Death Valley fault system in order to fill in one of the last major missing pieces of the slip rate “puzzle” in the ECSZ. We have collected ALSM data from two 2 x 10 km swaths along the northern Death Valley fault zone. These data reveal numerous alluvial fan offsets, ranging from 82 - 390 m. In addition, we have access to ALSM data from the normal fault system in central Death Valley (in collaboration with T. Wasklewicz).
We are currently processing cosmogenic nuclide samples from five sites to establish slip rates at a variety of time scales along the central and northern Death Valley fault zone. The proposed data, in combination with published rates, will provide a synoptic view of the cumulative slip rates of the major faults of the ECSZ north of the Garlock fault. Comparison of these longer-term rate data with short-term geodetic data will allow us to determine whether strain storage and release have been constant over the Holocene-late Pleistocene time scales of interest, or whether the current strain transient observed in the Mojave section of the ECSZ extends away from the zone of structural complexity associated with the Big Bend of the San Andreas fault.
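
Once the cosmogenic ages are in hand, a slip rate and its uncertainty follow from dividing offset by age; a Monte Carlo sketch with Gaussian errors is shown below. The 82 m figure matches the smallest offset quoted above, but the age and all uncertainties are purely illustrative placeholders, since the samples are still being processed.

```python
import random
import statistics

def slip_rate_mc(offset_m, offset_sigma_m, age_ka, age_sigma_ka,
                 n=20000, seed=42):
    """Monte Carlo slip rate from an offset landform and an exposure age,
    both treated as Gaussian; returns (mean, stdev) in mm/yr.
    Note that m/ka and mm/yr are numerically identical units."""
    rng = random.Random(seed)
    rates = []
    while len(rates) < n:
        d = rng.gauss(offset_m, offset_sigma_m)
        t = rng.gauss(age_ka, age_sigma_ka)
        if d > 0.0 and t > 0.0:       # reject unphysical draws
            rates.append(d / t)       # m/ka == mm/yr
    return statistics.mean(rates), statistics.stdev(rates)
```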

Slip Rates, Recurrence Intervals and Earthquake Magnitudes for the Southern Black Mountain Fault Zone, Southern Death Valley, California

Fronterhouse Sohn, Marsha (CSUF), Jeffrey Knott (CSUF), and David Bowman (CSUF)

The normal-oblique Black Mountain Fault zone (BMFZ) is part of the Death Valley fault system. Strong ground-motion generated by earthquakes on the BMFZ poses a serious threat to the Las Vegas, NV area (pop. ~1,428,690), the Death Valley National Park (max. pop. ~20,000) and


Pahrump, NV (pop. 30,000). Fault scarps offset Holocene alluvial-fan deposits along most of the 80-km length of the BMFZ. However, slip rates, recurrence intervals, and event magnitudes for the BMFZ are poorly constrained due to a lack of age control. Also, Holocene scarp heights along the BMFZ range from <1 m to >6 m suggesting that geomorphic sections have different earthquake histories.

Along the southernmost section, the BMFZ steps basinward, preserving three post-late Pleistocene fault scarps. Regression plots of vertical offset versus maximum scarp angle suggest event ages of < 10 – 2 ka, with a post-late Pleistocene slip rate of 0.1 mm/yr – 0.3 mm/yr and a recurrence of < 3300 years/event. Regression equations based on the estimated, geomorphically constrained rupture length of the southernmost section and surveyed event displacements provide estimated moment magnitudes (Mw) between 6.6 and 7.3 for the BMFZ.
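
Such estimates can be reproduced in form with a rupture-length regression of the kind described here; the coefficients below are the Wells and Coppersmith (1994) all-slip-type surface-rupture-length values, which may not be the regression actually used, and the inputs are illustrative.

```python
import math

def mw_from_srl(srl_km, a=5.08, b=1.16):
    """Moment magnitude from surface rupture length (km) via a
    Wells & Coppersmith (1994)-style regression M = a + b*log10(SRL)."""
    return a + b * math.log10(srl_km)

def recurrence_years(event_slip_m, slip_rate_mm_yr):
    """Mean recurrence interval implied by per-event slip and slip rate."""
    return event_slip_m * 1000.0 / slip_rate_mm_yr
```

For example, a ~25 km rupture length gives Mw near 6.7, within the 6.6-7.3 range quoted above, and 0.66 m of slip per event at 0.2 mm/yr implies a ~3300-year recurrence.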

A Full-Crustal Refraction/Reflection Model of LARSE Line 2: Thrusting of the Santa Monica Mountains-San Fernando Valley Block Beneath the Central Transverse Ranges, Southern California

Fuis, Gary S. (USGS), Janice M. Murphy (USGS), Shirley Baher (USGS), David A. Okaya (USC), and Trond Ryberg (GFZ-Potsdam)

LARSE line 2, an explosion refraction/reflection line, extended from the coast at Santa Monica, California, northward to the Sierra Nevada, passing through or near the epicenters of the 1971 and 1994 M 6.7 San Fernando and Northridge earthquakes. Tomographic models of Lutter et al. (BSSA, 2004) contain tantalizing hints of geologic structure, but are necessarily smoothed and extend no deeper than about 8 km depth. Vertical-incidence reflection data (Fuis et al., Geology, 2003) reveal some important deep structures, but are low-fold and produce the best images below the upper-crustal region modeled by tomography. To sharpen the velocity image and to extend it to the Moho, we have forward-modeled both refraction and wide-angle reflection data and have incorporated surface geologic and sparse drillhole constraints into the model. The resulting image reveals the shapes of sedimentary basins underlying the San Fernando, Santa Clarita, and Antelope (western Mojave) Valleys as well as many active and inactive faults in the upper crust. Importantly, it also reveals a major north-dipping wide-angle-reflective zone extending downward from the 1971 San Fernando hypocenter toward the San Andreas fault (SAF). This zone, coincident with a vertical-incidence-reflective zone, is interpreted as a shear zone separating rocks of the Santa Monica Mountains and San Fernando Valley (SMM-SFV) from rocks of the Central Transverse Ranges (CTR). The SMM-SFV represents a block of the Peninsular Ranges terrane that is currently underthrusting the CTR, presumably as it continues to rotate clockwise. There is a symmetrical south-dipping wide-angle-/vertical-incidence-reflective zone on the north side of the San Andreas fault, but it is not clear if this structure is currently active. Moho is depressed to a maximum depth of 36 km beneath the CTR, defining a crustal root similar to that on LARSE line 1, 70 km to the east, but with smaller relief (3-5 km vs. 
5-8 km), and it is centered ~5 km south of the SAF. The SAF appears to offset all layers above the Moho.

SCEC/UseIT: Focal Mechanism Evolution and Smarter Text

Garcia, Joshua (USC)

The UseIT program brings together undergraduates from a variety of majors to create information technology products for use in understanding and studying earthquakes. This UseIT summer grand challenge involved creating an earthquake monitoring system using the software project, SCEC-VDO, started by last year's intern group. My contribution to the program this summer was two-fold. The first part of my contribution involved creating a new representation for focal mechanisms in SCEC-VDO.


These focal mechanisms are represented using orthogonal intersecting discs. These disc representations allow easier identification of the plane of the focal mechanism that may lie on an actual fault. A new graphical user interface (GUI) was implemented to allow easy and intuitive use of this new focal mechanism representation. An additional GUI was created to allow users to select any colors they wish for compression or extension.

The second part of my contribution involved updating the text plug-in of SCEC-VDO to allow smarter and more flexible placement of text at any point defined by latitude, longitude and altitude on the SCEC-VDO display screen. New features for the text plug-in include file format flexibility, master list searching, and a text table. A new GUI was created for this feature as well.

Complexity as a Practical Measure for Seismicity Pattern Evolution

Goltz, Christian (UCD)

Earthquakes are a "complex" phenomenon. There is, however, no clear definition of what complexity actually is. Yet it is important to distinguish between what is merely complicated and what is complex in the sense that simple rules can give rise to very rich behaviour. Seismicity is certainly a complicated phenomenon (difficult to understand), but simple models such as cellular automata indicate that earthquakes are truly complex. From the observational point of view, there remains the problem of quantifying complexity in real-world seismicity patterns. Such a measurement is desirable, not only for fundamental understanding but also for monitoring and possibly for forecasting. Perhaps the most workable definitions of complexity exist in informatics, summarised under the topic of algorithmic complexity. Here, after introducing the concepts, I apply such a measure of complexity to temporally evolving real-world seismicity patterns. Finally, I discuss the usefulness of the approach and consider the results in view of the occurrence of large earthquakes.
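
One concrete algorithmic-complexity measure of the kind discussed is the Lempel-Ziv (1976) phrase-counting complexity, applied to a seismicity pattern encoded as a symbol sequence (for example, 1 where the rate in a time bin exceeds the median and 0 otherwise). The sketch below assumes that encoding step is done upstream; it is an illustration, not necessarily the measure used in this work.

```python
def lz76_complexity(sequence):
    """Lempel-Ziv (1976) complexity: the number of new phrases produced
    when the sequence is parsed left to right, each phrase extending
    until it is no longer found in the preceding history."""
    s = ''.join(str(x) for x in sequence)
    i, c, n = 0, 0, len(s)
    while i < n:
        l = 1
        # grow the phrase while it still occurs earlier in the string
        while i + l <= n and s[:i + l - 1].find(s[i:i + l]) != -1:
            l += 1
        c += 1
        i += l
    return c
```

Periodic or constant sequences parse into very few phrases, while irregular sequences parse into many, which is what makes the count usable as a monitoring statistic.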

Attenuation of Peak Ground Motion from June 2005 Anza and Yucaipa California Earthquakes

Graizer, Vladimir (CGS) and Anthony Shakal (CGS)

A large amount of strong motion data was recorded during two recent earthquakes in Southern California: the Mw 5.2 (ML 5.6) event of 6/12/2005 near Anza, and the Mw 4.9 (ML 5.3) event near Yucaipa. The Anza earthquake occurred within the San Jacinto fault zone and had a strike-slip mechanism. The Yucaipa earthquake occurred within the San Andreas fault zone and had a dip-slip mechanism (predominantly thrust motion with a small left-lateral component).

The two data sets include strong-motion data recorded by the California Strong Motion Instrumentation Program, the USGS National Strong Motion Program, the Southern California Network and the Anza Regional Network. The Anza data set includes 279 data points, and the Yucaipa data set includes 388 points. The data sets were limited by the triggering level of strong-motion accelerographs (0.005 g) to assure uniform and complete representation. We don’t recommend merging these data sets with data of lower amplitude (mostly recorded by velocity sensors) at this time.

We compared five existing attenuation relationships (Abrahamson & Silva, 1997; Boore, Joyner, Fumal, 1997; Campbell, 1997; Idriss, 1991; Sadigh et al., 1997) with the recorded data for distances of up to 150-200 km (all of those attenuation relationships were designed for distances of up to 80 km). For both the Anza and Yucaipa earthquakes, the attenuation curves generally underpredict the recorded peak ground motions.
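
Such a comparison amounts to evaluating ln-residuals of observed peaks against a predictive curve. The functional form and coefficients below are generic illustrations only, not any of the published relations named above.

```python
import math

def predicted_pga_g(mag, dist_km, c1=-3.5, c2=0.9, c3=-1.2, h=6.0):
    """Generic attenuation form ln(PGA) = c1 + c2*M + c3*ln(R), with
    R = sqrt(d^2 + h^2). Coefficients are illustrative placeholders."""
    R = math.sqrt(dist_km ** 2 + h ** 2)
    return math.exp(c1 + c2 * mag + c3 * math.log(R))

def log_residual(observed_g, mag, dist_km):
    """ln(observed/predicted); a positive value means underprediction."""
    return math.log(observed_g / predicted_pga_g(mag, dist_km))
```

Systematically positive mean residuals over a data set are what "generally underpredict" means in this framework.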


Kinematic Rupture Model Generator

Graves, Robert (URS) and Arben Pitarka (URS)

There are several ongoing efforts within SCEC that involve the use of ground motion simulations for scenario earthquakes. These include the NSF Project on Implementation Interface (NGA-H, structural response simulations) and Pathway II components of CME. Additionally, developments are underway within the Seismic Hazard Analysis focus group to implement time history generation capabilities (e.g., CyberShake). A key component that these projects all depend upon is the ability to generate physically plausible earthquake rupture models, and to disseminate these models in an efficient, reliable and self-consistent manner. The work presented here takes the first step in addressing this need by developing a computation module to specify and generate kinematic rupture models for use in numerical earthquake simulations. The computational module is built using pre-existing models of the earthquake source (e.g., pseudo-dynamic, k^(-2) wavenumber decay, etc.). In the initial implementation, we have employed simple rules to compute rupture initiation time based on scaling of the local shear wave velocity. The slip velocity function is a simplified Kostrov-like pulse, with the rise time given by a magnitude scaling relation. However, the module is not restricted to these parameterizations; it is constructed to allow alternative parameterizations to be added in a straightforward manner. One of the most important features of the module is the use of a Standard Rupture Format (SRF) for the specification of kinematic rupture parameters. This will help ensure consistent and accurate representation of source rupture models and allow the exchange of information between research groups to occur in a more seamless manner.
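
Two of the module's ingredients can be sketched: a rupture initiation time scaled to the local shear wave speed, and a simplified Kostrov-like slip-velocity pulse normalized to the total slip. The 0.8 velocity fraction and the pulse shape below are assumptions for illustration, not the actual module or the SRF specification.

```python
import math

def rupture_time(dist_from_hypo_km, local_vs_km_s, vr_fraction=0.8):
    """Rupture initiation time (s) at a subfault: hypocentral distance
    divided by a rupture velocity scaled to the local shear wave speed."""
    return dist_from_hypo_km / (vr_fraction * local_vs_km_s)

def kostrov_like_pulse(t, rise_time, total_slip_m):
    """Simplified Kostrov-like slip-velocity pulse (m/s): ~1/sqrt(t)
    onset tapered to zero at the rise time, normalized so its time
    integral equals the total slip. Illustrative shape only."""
    if t <= 0.0 or t >= rise_time:
        return 0.0
    x = t / rise_time
    # shape f(x) = sqrt((1 - x)/x); its integral over (0, 1) is pi/2
    return total_slip_m * math.sqrt((1.0 - x) / x) / (rise_time * math.pi / 2.0)
```

A rupture model generator evaluates rules like these at every subfault and writes the results out in a common exchange format, which is the role the SRF plays here.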

Grid Activities in SCEC/CME

Gullapalli, Sridhar, Ewa Deelman, Carl Kesselman, Gaurang Mehta, Karan Vahi, and Marcus Thiebaux (USC/ISI)

The Center for Grid Technologies Group at USC/ISI is an integral part of the SCEC/CME effort, contributing software and services.

Cybershake: The purpose of this critical activity is to create Probabilistic Seismic Hazard curves for several sites in the Los Angeles area. The SCEC/CME Project brings together all the elements necessary to perform this work, including the Probabilistic Seismic Hazard Analysis (PSHA), the geophysical models, the validated wave propagation tools, the scientific workflow capabilities, and the data management capabilities. CGT contributed to all aspects of the planning, development and execution of the workplan for the Cybershake activity.

Grid-based Workflow Tools: CGT staff have been working to develop, customize and integrate Grid-based workflow tools including Pegasus, VDS, Condor-G, and DAGMan into SCEC/CME, enabling SCEC scientists to focus on geophysics problems by automating and abstracting the details of executing these large simulations. In the coming year, CGT will develop a production-quality set of tools for use by SCEC earthquake scientists to ease the task of performing repetitive and similar tasks.

Integrated workflow solutions with AI-based Knowledge and Semantic Tools: CGT will continue to work as part of a close team with our expert colleagues in the Artificial Intelligence discipline to design, develop and harden an integrated workflow solution, coupling Grid-based workflow tools with knowledge and semantic tools to meet the requirements of the SCEC proposal.


Visualization tools and Services: The visualization efforts at ISI have been focused on developing an interactive browsing framework for large SCEC datasets like Terashake, LA 3D and Cybershake.

Distributed SCEC Testbed deployment and operation: The Grid Group works closely with our colleagues at SCEC, USC, ISI and the TeraGrid to build and customize the distributed computational and data storage infrastructure required for SCEC Pathway executions under the Community Modeling Environment and the SCEC Virtual Organization. Computational resources, software and an integrated environment to solve seismic modeling problems of interest to SCEC scientists span multiple organizations, including SCEC, USC, ISI and the distributed TeraGrid infrastructure.

Discrete Element Simulations of Elasto-Plastic Fault Block InteractionsGuo, Yonggui (Rice University) and Julia K. Morgan (Rice University)

In order to gain a better understanding of earthquake distributions, we carry out Distinct Element simulations in 2D to simulate the brittle failure and deformation within and between two interacting fault blocks. Fault blocks composed of bonded particles are sheared along a linear fault surface, allowing for block fracture and fragmentation. Deformation within the fault blocks is driven either by lateral boundary displacement or by basal boundary displacement, coupled by elastic springs to interacting particles at the surface. Our preliminary results show that the early phase of fault deformation is accommodated by the development of asymmetric tensile fractures and wear of the fault surface. When the gouge zone is sufficiently thick to completely separate the opposing fault surfaces, shear strain is accommodated mainly by shearing of interlocking gouge grains along locally developed shear surfaces, resulting in much lower shear stress within the gouge zone. The results suggest that, in addition to the boundary conditions and physical properties of the fault blocks, the internal structure, which evolves from a fault-fault interaction configuration to a fault-gouge interaction configuration, also has a significant effect on fault zone dynamics.

Poroelastic Damage Rheology: Dilation, Compaction and Failure of Rocks

Hamiel, Yariv (SIO, UCSD), Vladimir Lyakhovsky (Geological Survey of Israel), and Amotz Agnon (Hebrew University of Jerusalem)

We present a formulation for mechanical modeling of the interaction between fracture and fluid flow. The new model combines the classical Biot poroelastic theory with a damage rheology model. The theoretical analysis, based on thermodynamic principles, leads to a system of coupled kinetic equations for the evolution of damage and porosity. Competition between two thermodynamic forces, one related to porosity change and one to microcracking, defines the mode of macroscopic rock failure. At low confining pressures rock fails in a brittle mode, with strong damage localization in a narrow deformation zone. The thermodynamic force related to microcracking is dominant and the yield stress increases with confining pressure (positive slope of the yield curve). The role of the porosity-related thermodynamic force increases with increasing confining pressure, eventually leading to a decrease of yield stress with confining pressure (negative slope of the yield curve). At high confining pressures damage is non-localized and the macroscopic deformation of the model corresponds to experimentally observed cataclastic flow. In addition, the model correctly predicts different modes of strain localization such as dilating shear bands and compacting shear bands. We present numerical simulations in 3D that demonstrate rock-sample deformation in different modes of failure. The simulations reproduce the gradual transition from brittle fracture to cataclastic flow. The development provides an internally consistent framework for simulating coupled evolution of fracturing and fluid


flow in a variety of practical geological and engineering problems, such as the nucleation of deformation features in poroelastic media and fluid flow during the seismic cycle.

SCEC/UseIT: Animation in SCEC-VDO

Haqque, Ifraz (USC)

UseIT is an intern program organized by SCEC annually to provide college students with an opportunity to conduct research in information technology related to geoscience.

This summer’s grand challenge for the SCEC IT interns involved expanding SCEC-VDO, the earthquake visualization software developed over the previous year, so that it may be used as a real-time earthquake monitoring system. Our team was broken down into several groups, each involved in some aspect of developing the software to meet our goals. As part of the Earthquake Analysts group, my primary contribution to the software was the ability to animate earthquakes. This functionality enables scientists to look at sequences of earthquakes and how they occurred relative to each other in time, and it brought the software as a whole towards being a true earthquake monitoring system. Integrating the animation code with the existing code proved difficult at first, but became easier as I spent more time analyzing and familiarizing myself with that code. All in all, my time at SCEC has vastly improved my software development skills and provided me with the valuable experience of working as part of a team.

A New Focal Mechanism Catalog for Southern California

Hardebeck, Jeanne (USGS), Peter Shearer (Scripps, UCSD), and Egill Hauksson (Caltech)

We present a new focal mechanism catalog for southern California, 1984-2003, based on S-wave/P-wave amplitude ratios and catalog P-wave first motion polarities. The S/P ratios were computed during the SCEC-sponsored Caltech/UCSD analysis of the entire waveform catalog (Hauksson and Shearer, BSSA 95, 896-903, 2005; Shearer et al., BSSA 95, 904-915, 2005) and form the most complete set of S/P ratios available for southern California. The focal mechanisms were computed with the technique of Hardebeck and Shearer (BSSA 92, 2264-2276, 2002; BSSA 93, 2434-2444, 2003), which estimates mechanism quality from the solution stability with respect to input parameter perturbations. The dataset includes more than 24,000 focal mechanisms, and the highest-quality solutions were found for 6380 earthquakes, most having M 1.5-3.5. The focal mechanism catalog is available from the Southern California Earthquake Data Center alternate catalogs web page: http://www.data.scec.org/research/altcatalogs.html.

Homogeneity of Small-Scale Earthquake Faulting, Stress and Fault Strength

Hardebeck, Jeanne (USGS)

Small-scale faulting at seismogenic depths in the crust appears to be more homogeneous than previously thought. For example, Rivera & Kanamori [GRL 29, 2002] found focal mechanisms oriented "in all directions" in southern California, implying a heterogeneous stress field and/or heterogeneous fault strength. Surprisingly, I obtain very different results for three new high-quality focal mechanism datasets of small (M<~3) earthquakes in southern California, the east San Francisco Bay area, and the aftershock sequence of the 1989 Loma Prieta earthquake. The difference may be that I used only high-quality mechanisms, and that I consider the statistical uncertainty in the mechanisms. I quantify the degree of mechanism variability on a range of length scales by comparing the hypocentral distance between every pair of events and the angular difference between their focal mechanisms. Closely-spaced earthquakes (inter-hypocentral distance 0-2 km) tend to have very similar focal mechanisms, often identical to within the 1-sigma uncertainty of ~25°. Small-scale focal mechanism similarity is observed in strike-slip (e.g. East Bay), thrust (e.g. Northridge aftershocks) and normal-faulting (e.g. southern Sierra Nevada) regimes. This observed similarity implies that in small volumes of crust, while faults of many orientations may be present, only similarly-oriented fault planes produce earthquakes contemporaneously. On these short length scales, the crustal stress and fault strength (coefficient of friction) are presumably homogeneous as well, to produce such similar earthquakes. Over larger length scales (2-50 km), focal mechanisms become more diverse with increasing inter-hypocentral distance (differing by 40°-70°). Mechanisms over ~50 km length scales can be explained by relatively small variations (~30%) in stress or fault strength, since different mechanisms can be generally consistent with the same stress field (e.g. thrust and strike-slip faulting accommodating the same shortening direction). It is possible that most of this small apparent heterogeneity in stress or strength comes from measurement error in the focal mechanisms, as negligible variation in stress or fault strength (<10%) is needed if each earthquake is assigned the optimal focal mechanism within the 1-sigma confidence region. This local homogeneity in stress and fault strength is encouraging, implying it may be possible to measure these parameters with enough precision to be useful in studying and modeling larger earthquakes.
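
The pairwise comparison described here can be sketched numerically. In the hedged Python fragment below, the function names are our own, and the normalized-tensor dot-product angle is only a simple proxy for the minimum-rotation angle between double-couple mechanisms; the tensor components follow the standard Aki & Richards strike/dip/rake convention (x north, y east, z down):

```python
import numpy as np

def moment_tensor(strike, dip, rake):
    """Unit-norm double-couple moment tensor from strike/dip/rake (degrees),
    Aki & Richards convention: x = north, y = east, z = down."""
    phi, delta, lam = np.radians([strike, dip, rake])
    sd, cd = np.sin(delta), np.cos(delta)
    s2d, c2d = np.sin(2 * delta), np.cos(2 * delta)
    sf, cf = np.sin(phi), np.cos(phi)
    s2f, c2f = np.sin(2 * phi), np.cos(2 * phi)
    sl, cl = np.sin(lam), np.cos(lam)
    mxx = -(sd * cl * s2f + s2d * sl * sf ** 2)
    mxy = sd * cl * c2f + 0.5 * s2d * sl * s2f
    mxz = -(cd * cl * cf + c2d * sl * sf)
    myy = sd * cl * s2f - s2d * sl * cf ** 2
    myz = -(cd * cl * sf - c2d * sl * cf)
    mzz = s2d * sl
    m = np.array([[mxx, mxy, mxz], [mxy, myy, myz], [mxz, myz, mzz]])
    return m / np.linalg.norm(m)

def mechanism_angle(mech1, mech2):
    """Angle (degrees) between two mechanisms from the tensor dot product:
    0 for identical mechanisms, larger for dissimilar ones."""
    c = np.clip(np.sum(moment_tensor(*mech1) * moment_tensor(*mech2)), -1.0, 1.0)
    return float(np.degrees(np.arccos(c)))

def hypocentral_distance(h1, h2):
    """3-D separation (km) between hypocenters given as (x, y, depth) in km."""
    return float(np.linalg.norm(np.asarray(h1, float) - np.asarray(h2, float)))
```

Binning `mechanism_angle` by `hypocentral_distance` over all event pairs reproduces the kind of distance-dependence curve discussed in the abstract.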

A Scientific Hypothesis Challenged by Parkfield Earthquakes – Material Contrasts and Rupture Propagation Direction

Harris, Ruth (USGS)

It has been proposed by a number of authors that a material contrast (different shear modulus = different density x Vs^2) across a fault will lead to unilateral rupture propagation in the slip-direction of the less-rigid material, hereafter called the single-propagation-direction hypothesis. In contrast, Harris and Day [BSSA, 1997] proposed that a fault-aligned bimaterial interface would lead to rupture behavior that was affected by the material contrast, but bilateral rupture propagation would still occur. Both of these hypotheses were based on numerical simulations of spontaneous rupture propagation in an elastic medium. Earthquakes in the Parkfield, California region of the San Andreas fault provide an excellent test of these competing hypotheses. The slip-direction of the NW-SE striking San Andreas fault is right-lateral strike slip and lower-velocity (‘softer’) material resides on the northeast side of the fault. Therefore the single-propagation-direction hypothesis implies that Parkfield earthquakes should be NW to SE propagating ruptures. Observations of two of the largest seismologically-recorded earthquakes at Parkfield seem to support this hypothesis, with the 1966 and 1934 mainshocks propagating NW to SE. However, investigations of the M5 foreshocks that preceded the 1934 and 1966 earthquakes by 17 minutes showed SE to NW propagation [Bakun and McEvilly, 1981]. Smaller, M4 earthquakes in 1992, 1993, and 1994 also showed variable rupture propagation directions, inconsistent with material-contrast-predicted single-direction behavior [Fletcher and Spudich, 1998]. The occurrence of the 2004 M6 earthquake, with SE to NW rupture direction, also contradicts the single-propagation-direction hypothesis. As proposed by Harris and Day [1997], although rupture behavior can easily be affected by material contrast, propagation direction is not pre-determined.

Attenuation Models (QP and QS) in Three Dimensions of the Southern California Crust: Inferred Evidence for Dry Crust and Wet Faults

Hauksson, Egill (Caltech) and Peter Shearer (UCSD)

We analyze high-fidelity waveform spectra to determine t* values for both P- and S-waves from earthquakes in southern California. We invert the t* values for three-dimensional (3D) frequency-independent QP and QS regional models of the crust. The models have 15 km horizontal grid spacing and an average vertical grid spacing of 4 km, down to 30 km depth, and extend from the US-Mexico border in the south to the Coast Ranges and Sierra Nevada in the north. In general, QP and QS increase rapidly with depth, consistent with crustal densities and velocities. The 3D QP and QS models prominently image the major tectonic structures and, to a much lesser extent, the thermal structure of the southern California crust. The near-surface low QP and QS zones coincide with major sedimentary basins such as the San Bernardino, Chino, San Gabriel Valley, Los Angeles, Ventura, and Santa Maria basins, and the Salton Trough. In contrast, at shallow depths beneath the Peninsular Ranges, southern Mojave Desert and southern Sierras, we image high QP and QS zones, which correspond to the dense and high velocity rocks of the mountain ranges. Several clear transition zones of rapidly varying QP and QS coincide with major late Quaternary faults and connect regions of high and low QP and QS. At mid-crustal depths the QP and QS models form imbricate stacks of slightly higher and lower QP or QS zones, which is consistent with reported crustal reflectivity. In general, for the southern California crust, QS/QP is greater than 1.0, suggesting dry crust. A few limited regions of QS/QP less than 1.0 correspond to areas around some of the major strike-slip faults and the Salton Trough and suggest a larger reduction in the shear modulus compared to the bulk modulus, or fluid saturation.
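
The t* measurement underlying this inversion can be illustrated in a few lines. Assuming the standard whole-path attenuation model, a path-corrected spectrum decays as A(f) = A0 * exp(-pi * f * t*), so t* follows from a straight-line fit to ln A(f); the numbers below are synthetic and the function name is ours:

```python
import numpy as np

def estimate_tstar(freqs, amps):
    """Fit ln A(f) = ln A0 - pi * f * t* by least squares; return (A0, t*).
    Assumes source and site terms have already been removed from the spectrum."""
    slope, intercept = np.polyfit(freqs, np.log(amps), 1)
    return np.exp(intercept), -slope / np.pi

# Synthetic check: a flat spectrum of amplitude 3.0 attenuated with t* = 0.04 s.
f = np.linspace(1.0, 20.0, 50)
a = 3.0 * np.exp(-np.pi * f * 0.04)
a0, tstar = estimate_tstar(f, a)
```

Many such t* picks along crossing ray paths then constrain the 3D Q models, since t* is the path integral of dt/Q.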

Associating Southern California Seismicity with Faults Using Bayesian Inference

Hauksson, Egill (Caltech), Rob Wesson (USGS), Peter Shearer (UCSD), and John Shaw (Harvard)

Gravity Changes Over California Using a Time-Dependent Earthquake Model

Hayes, Tyler (UWO), Kristy Tiampo (UWO), and John Rundle (UC Davis)

The gravity signal contains information regarding changes in density at all depths and can be used as a proxy for the strain accumulation in fault networks. Using a stress-evolution time-dependent model, simulated slip histories were created for the San Andreas Fault network in California. From the slip histories generated, and through the use of gravity Green's functions, we generated a time-dependent model of how the gravity field evolves for the San Andreas Fault network. The steady-state, long-term component of gravity produces a signal with a magnitude of approximately 0.002 mgal over a 5-year accumulation period, which is at the limit of current portable instrument observations. Signals generated from moderate to large events produce magnitudes in the range of 0.060 mgal to 0.080 mgal, well within observational limits. The complex fault geometry contributes to the shape of the spatial distribution of the various gravity signals observed; the signal magnitude is constrained by the amount of slip modeled. The gravity signal over long periods exhibits highly complex patterns that reflect the geometry of the system and the constant loading of stress and strain from the long-term plate motion.

Estimating Stress Heterogeneity from Aftershock Rate

Helmstetter, Agnes (LDEO Columbia) and Bruce Shaw (LDEO Columbia)

We estimate the rate of aftershocks triggered by a heterogeneous stress change, using the rate-and-state model of Dieterich [1994]. We show that an exponential stress distribution P(S) ~ exp(-S/S0) gives an Omori-law decay of aftershocks with time, R(t) ~ 1/t^p, with an exponent p = 1 - A*Sn/S0, where A is a parameter of the rate-and-state friction law and Sn is the normal stress. The Omori exponent p thus decreases if the stress "heterogeneity" S0 decreases. We also invert the stress distribution P(S) from the seismicity rate R(t), assuming that the stress does not change with time. We apply this method to a synthetic stress map, using the (modified) scale-invariant "k2" slip model [Herrero and Bernard, 1994]. We generate synthetic aftershock catalogs from this stress change. The seismicity rate on the rupture area shows a huge increase at short times, even if the stress decreases on average. Aftershocks are clustered in the regions of low slip, but the spatial distribution is more diffuse than for a simple slip dislocation. Because the stress field is very heterogeneous, there are many patches of positive stress changes everywhere on the fault. This stochastic slip model gives a Gaussian stress distribution, but nevertheless produces an aftershock rate which is very close to Omori's law, with an effective p<=1 which increases slowly with time. The rate-and-state model cannot explain an Omori law with p>1, unless the stress decreases with time after the mainshock. The inversion of the full stress distribution P(S) is badly constrained for negative stress values, and for very large positive values, if the time interval of the catalog is limited. However, constraining P(S) to be a Gaussian distribution allows a good estimation of P(S) for a limited number of events and catalog duration.
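
The exponent p = 1 - A*Sn/S0 can be checked by numerically averaging the Dieterich (1994) rate over an exponential stress distribution. In this sketch, rates are normalized by the background rate and times by the aftershock duration t_a, and A*Sn/S0 = 0.2 is an illustrative choice of ours (so the predicted exponent is p = 0.8):

```python
import numpy as np

def dieterich_rate(t, s):
    """Dieterich (1994) seismicity rate after a stress step s (in units of
    A*Sn), with background rate 1 and aftershock duration t_a = 1."""
    return 1.0 / ((np.exp(-s) - 1.0) * np.exp(-t) + 1.0)

asig_over_s0 = 0.2                     # A*Sn / S0, so predicted p = 0.8
s = np.linspace(0.0, 50.0, 20001)      # positive stress steps, units of A*Sn
w = np.exp(-s * asig_over_s0)          # exponential P(S) ~ exp(-S/S0)
w /= w.sum()                           # normalize for a discrete average

t = np.logspace(-5, -2, 30)            # times well before t_a
rate = np.array([np.sum(w * dieterich_rate(ti, s)) for ti in t])

# Effective Omori exponent from the log-log slope.
p_eff = -np.polyfit(np.log(t), np.log(rate), 1)[0]
```

In this setup p_eff comes out close to 1 - A*Sn/S0, in line with the analytic result quoted in the abstract.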

Status Report on Plate Boundary Observatory GPS Data Analysis

Herring, T., R. King, S. McClusky (MIT), M. Murray (Berkeley Seismological Laboratory), V. M. Santillan, T. Melbourne (CWU), and G. Anderson (UNAVCO)

The Plate Boundary Observatory GPS data analysis centers (ACs) at the Berkeley Seismological Laboratory (BSL) and Central Washington University (CWU), and the analysis center coordinator (ACC) at the Massachusetts Institute of Technology began establishing the GPS processing centers on April 1, 2005. The PBO GPS data analyses will be operational on October 1, 2005, with the regular delivery of daily SINEX solution files with full covariance and time series files designed for ease of analysis and plotting. The initial results for the PBO analyses will start with data from January 1, 2004 and contain position time series for 209 PBO Nucleus stations, 17 IGS reference frame stations, 9 CORS stations (to bridge Alaska to the continental United States), and all PBO stations as they come on line. The first PBO site was ready on January 11, 2004. The PBO ACs routinely generate rapid orbit analyses (1-day latency), primarily to check data quality, and initial final orbit analyses with 6-13 day latency. The timing of the generation of these products is based on the availability of the International GNSS Service rapid and final orbit products. Currently, between 280 and 300 stations are included in these rapid analyses and typically 310-315 stations are in the final analyses. The initial testing of the analysis procedures shows median north and east daily root-mean-square (rms) scatters of 1.0-1.3 mm for horizontal positions for a network encompassing North America and Central Pacific. The median height rms scatter is 5 mm. The poster will show preliminary results and products being generated by the PBO ACs and ACC, and compare the results between the GIPSY (CWU) and GAMIT (BSL) processing.
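
The quoted daily rms scatters are repeatabilities of position time series about a secular trend. A minimal sketch of that statistic (hypothetical helper, synthetic numbers; not the PBO processing itself):

```python
import numpy as np

def detrended_rms(days, pos_mm):
    """RMS scatter (mm) of a daily position component about its best-fit
    linear trend, i.e. the repeatability used to judge solution quality."""
    resid = pos_mm - np.polyval(np.polyfit(days, pos_mm, 1), days)
    return float(np.sqrt(np.mean(resid ** 2)))

days = np.arange(100.0)
# A 0.5 mm/day drift plus an alternating +/-1 mm daily scatter.
rms_noisy = detrended_rms(days, 0.5 * days + (-1.0) ** np.arange(100))
rms_clean = detrended_rms(days, 2.0 * days + 3.0)   # pure trend, rms near zero
```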

Seismicity on a Fault Controlled by Rate- and State-Dependent Friction with Spatial Variations of the Critical Slip Distance

Hillers, G. (ETH), Y. Ben-Zion (USC) and P.M. Mai (ETH)

We perform systematic simulations of slip using a quasi-dynamic continuum model of a 2-D strike-slip fault governed by rate- and state-dependent friction. The depth-dependence of the a-b and L frictional parameters is treated in an innovative way that is consistent with available laboratory data and multi-disciplinary field observations. Various realizations of heterogeneous L distributions are used to study effects of structural variations of fault zones on the spatio-temporal evolution of slip. We demonstrate that such realizations can produce within the continuum class of models realistic features of seismicity and slip distributions on a fault. We explore effects of three types of variable L distributions: (1) a depth-dependent L profile accounting for the variable width of fault zones with depth; (2) uncorrelated 2-D random distributions of L with different degrees of heterogeneity; and (3) a hybrid distribution combining the depth-dependent L profile with the 2-D random L distributions. The first type of L distribution generates system-wide stick-slip events. As L increases toward the bottom of the fault, the spatio-temporal slip evolution becomes irregular. The 2-D heterogeneous parameterizations generate frequency-size statistics with event sizes spanning four orders of magnitude and slip events with properties similar to those of natural earthquakes. Our results indicate that different degrees of heterogeneity of L distributions control (1) the number of simulated events and (2) the overall stress level and fluctuations. Other observable trends are (3) the dependency of hypocenter location on L and (4) different nucleation phases for small and large events in heterogeneous distributions.
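
The three L parameterizations can be mocked up in a few lines of NumPy. Everything here (grid size, fault dimensions, the L value range, the factor-of-two lognormal scatter) is an illustrative assumption of ours, not the values used in the study:

```python
import numpy as np

rng = np.random.default_rng(0)

# Assumed grid: 48 depth points over 0-24 km, 120 points along strike.
nz, nx = 48, 120
depth = np.linspace(0.0, 24.0, nz)[:, None]

# (1) Depth-dependent profile: L grows toward the bottom of the fault
#     (illustrative 1-5 cm range, mimicking a widening fault zone).
L_depth = 0.01 + 0.04 * (depth / depth.max())

# (2) Uncorrelated 2-D heterogeneity: lognormal, scatter of about a factor 2.
L_random = np.exp(rng.normal(0.0, np.log(2.0), (nz, nx)))

# (3) Hybrid: the depth profile modulated by the random field.
L_hybrid = L_depth * L_random
```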

Earthquake Forecasting Using Pattern Informatics

Holliday, James R. (UC Davis), Chien-chih Chen (UC Davis), Kristy F. Tiampo (U of Western Ontario), John B. Rundle (UC Davis), and Donald L. Turcotte (UC Davis)

We present a method to forecast earthquakes that is based on the identification of regions having large fluctuations in seismicity. In a forecast map originally published on February 19, 2002, the Pattern Informatics (PI) forecast for southern California successfully forecast the locations of the great majority of significant earthquakes in southern California during the past three years. Recent advances in the PI method show considerable improvement in the capability to forecast earthquake locations, especially when compared to a map of maximum relative intensity, the hypothesis that significant earthquakes of the future will occur where most of the small earthquakes (highest intensities) have occurred in the past. These methods have been successfully applied to California, to Japan, and on a worldwide basis. The current application requires probabilities for each location for a number of magnitude bins over a five-year period. We have therefore constructed a hybrid forecast in which we combine our PI method with historic seismicity data and an assumed small background rate to compute a map of probabilities for events occurring at any location, rather than just the most probable locations. These probabilities can be further converted, using Gutenberg-Richter scaling laws, to anticipated rates of future earthquakes that can be evaluated using the RELM test.
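
The probability-to-rate conversion can be sketched as follows. Assuming Poisson occurrence (rate = -ln(1 - P)) and a Gutenberg-Richter b-value, a single occurrence probability is spread across magnitude bins; the b-value, bin edges, and function name are our illustrative assumptions, not the RELM specification:

```python
import numpy as np

def rates_from_probability(p_occ, m_edges, b=1.0):
    """Convert an occurrence probability for M >= m_edges[0] over some period
    into expected event rates per magnitude bin (last bin open-ended),
    assuming Poisson occurrence and Gutenberg-Richter scaling with b-value b."""
    total_rate = -np.log(1.0 - p_occ)        # Poisson: p = 1 - exp(-rate)
    m = np.asarray(m_edges, float)
    w = 10.0 ** (-b * m[:-1]) - 10.0 ** (-b * m[1:])   # G-R mass per bin
    w = np.append(w, 10.0 ** (-b * m[-1]))             # open-ended top bin
    return total_rate * w / 10.0 ** (-b * m[0])        # normalize to M >= m[0]

# Example: a 50% probability of M >= 5 split into bins M 5-6, 6-7, and >= 7.
r = rates_from_probability(0.5, [5.0, 6.0, 7.0])
```

With b = 1, ninety percent of the total rate lands in the lowest bin, as G-R scaling requires.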

Anomalous Seismic Amplitudes Measured in the Los Angeles Basin Interpreted as a Basin-Edge Diffraction Catastrophe

Husker, Allen (UCLA)

The Los Angeles Basin Passive Seismic Experiment (LABPSE) involved the installation of an array of eighteen seismic stations along a line crossing the Los Angeles basin from the foothills of the San Gabriel Mountains through the Puente Hills to the coast. At 3-5 km spacing between stations, the array has much higher resolution than the permanent network of stations in southern California. This resolution was found to be important for analyzing the factors that govern the amplitude variation across the basin. We inverted spectra of P and S body wave seismograms from local earthquakes (ML = 2.1 - 4.8) for site effects, attenuation, and corner frequency, using a standard model that assumes geometric spreading varying as inverse distance, exponential attenuation, and an omega-squared source model. The S-wave attenuation was separable into basin and bedrock contributions. In addition to the body wave analysis, S-wave coda were analyzed for coda Q and coda-determined site effects. We find S-wave Q (QS) in bedrock is higher than in the basin. High-frequency QS is higher than low-frequency QS. Coda Q (QC) is higher than QS. P-wave Q (QP) was not separable into basin and bedrock values, so we determined an average value only. The corner frequencies for P and S waves were found to be nearly the same. The standard model fit over 97% of the S-wave data, but data from six clustered events incident along the basin edge within a restricted range of incidence and azimuth angles generated anomalous amplitudes of up to a factor of 5 higher than predicted. We test whether such basin-edge focusing might be modeled by catastrophe theory. After ruling out site, attenuation and radiation effects, we conclude a caustic modeled as a diffraction catastrophe could explain both the frequency and spatial dependence of the anomalous variation.

Early Pleistocene Emergence of New Dextral Faults SW of the Southern San Andreas Fault, Salton Trough

Janecke, Susanne U. (USU), Stefan Kirby (USU), Alexander Steely (USU), Andrew Lutz (U of O), Rebecca Dorsey (U of O), Bernard Housen (W Wash.), and Victoria Langenheim (USGS)

Results of field studies and magnetostratigraphy in the SW Salton Trough suggest reorganization of the plate boundary fault system spanning the 1.07 to 0.99 Ma Jaramillo subchron. From late Miocene to late Pliocene time the West Salton detachment fault controlled sedimentation in the SW Salton Trough. Starting around 1.1 Ma, a new set of cross-cutting dextral and dextral oblique-slip faults offset, folded, and deactivated most of the West Salton detachment. Emergence of the Fish Creek Mountains fault and growth of the transpressive San Felipe anticline during deposition of the 1.1 to 0.5 Ma Ocotillo Formation are the first signs of a new structural regime. An angular unconformity across the crest of the E-W trending, basement-cored San Felipe anticline changes to the north, south, and east into a disconformable to conformable contact. Magnetostratigraphic dating of this progressive unconformity at a disconformable, conformable, and nearly conformable contact shows that its age is identical (~1.1 Ma) across the San Felipe-Borrego subbasin.

The dextral-normal Fish Creek Mountains fault uplifted older basin-fill deposits and basement in the Fish Creek and Vallecitos Mountains and provided source material for the Ocotillo and Brawley formations. The Fish Creek Mountains fault is part of the San Felipe fault zone, which currently stretches from the Elsinore fault in the NW to the San Jacinto fault in the SE. Southwestward coarsening and thickening of the Ocotillo and Brawley formations toward the Fish Creek Mountains fault zone, NE- to E-directed paleocurrents, and crystalline and recycled sandstone clasts shed from newly uplifted older basin-fill deposits and underlying basement show that this fault uplifted the Fish Creek and Vallecito mountains for the first time in early Pleistocene time (~1.1 Ma). This uplift separated the formerly contiguous Fish Creek-Vallecito subbasin from the San Felipe-Borrego subbasin, and drove progradation of a 600 m thick sheet of Ocotillo-Brawley gravel and sand 25 km NE into the former perennial lake basin of the Borrego Formation. By ~1.0 Ma the Clark Lake and Santa Rosa segments of the Clark strand of the San Jacinto fault zone were shedding gravel SW into the northern San Felipe-Borrego subbasin. By 0.9 to 0.8 Ma, the NW end of the Coyote Creek strand along Coyote Canyon had developed NW-flowing streams that deposited sandstone and conglomerate within the fault zone. At about 0.9 Ma the Fish Creek–Vallecito subbasin ended a 7 m.y. period of subsidence and began to be exhumed. Studies in the San Felipe Hills and Borrego Badlands show that the modern geometry of the San Jacinto fault zone appeared after 0.6 to 0.5 Ma.

SCEC/UseIT: That Earthquake is Somewheres Fault

Jennings, Jadd (Temple University)

UseIT is a SCEC intern research program sponsored by the National Science Foundation. As a UseIT intern I engaged in team-based research projects, all of which addressed the grand challenge of creating an earthquake monitoring system. Such a system will allow for improved techniques of delivering important earthquake information to the public. SCEC has developed a Java3D-based program called SCEC-VDO (Virtual Display Objects), which gives users a three-dimensional view of southern California’s geologic and seismologic features. Interns have taken advantage of its ability to import user-created plug-ins.

I worked with the Fault Mapping team on projects that involved the integration of USGS fault data into SCEC-VDO. The data were contained in a text file, so I developed a plug-in to parse and display them. We also developed a plug-in that displays faults from the seismic hazard analysis open-source code.

Some of my work was challenging because of the Java3D Application Program Interface (API). The Java3D API is a group of Java packages used for audio/visual programming, and I had to research some of its classes and methods. This challenge benefited me because I was able to expand my knowledge of Java3D.

Coseismic and Postseismic Slip of the 2004 Parkfield Earthquake from Space-Based Geodetic Data

Johanson, Ingrid A. (UC Berkeley), Eric J. Fielding (JPL/Caltech), Frederique Rolandone (UPMC Paris VI), and Roland Bürgmann (UC Berkeley)

We present the results of an inversion of space-based geodetic data for slip in the coseismic and postseismic periods of the 2004 Parkfield earthquake. The data are a combination of InSAR data from the ENVISAT and RADARSAT satellites, campaign data collected by UC Berkeley and the USGS, and continuous GPS data from the SCIGN and PBO networks. The InSAR data consist of eight interferograms spanning the earthquake and variable portions of the postseismic period. The model assumes that slip in the postseismic period evolves as an exponential decay. In the inversion, we simultaneously solve for coseismic slip and the amplitude of the postseismic exponential; thus the postseismic slip distribution is not allowed to evolve with time. We obtain a value of 0.14 years for the decay time constant of the postseismic slip by fitting time series of GPS and creepmeter data. This decay time constant implies that 95% of the postseismic slip occurred within 150 days of the earthquake. The model resolves a total geodetic moment magnitude of Mw 6.3 for the coseismic and postseismic periods. Given a seismic moment of Mw 6.1, this suggests that 40-50% of the slip occurred aseismically. The coseismic rupture happened mainly in two high-slip asperities, the smaller occurring near the hypocenter and the larger occurring 10-20 km north. Shallow postseismic slip took place mainly on the fault areas surrounding the coseismic rupture, including the rupture areas of two Mw 5.0 aftershocks.
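
Two of the stated numbers follow directly from the stated model. With slip(t) proportional to 1 - exp(-t/tau) and tau = 0.14 yr, about 95% of the postseismic slip accrues within 150 days; and with seismic moment M0 proportional to 10^(1.5 Mw), an Mw 6.3 total against an Mw 6.1 seismic moment implies roughly half the moment was aseismic. A hedged arithmetic check (function names are ours):

```python
import numpy as np

TAU_YR = 0.14    # postseismic decay time constant from the GPS/creepmeter fit

def postseismic_fraction(t_days):
    """Fraction of total postseismic slip accrued by time t_days,
    for slip(t) proportional to 1 - exp(-t / tau)."""
    return 1.0 - np.exp(-(t_days / 365.25) / TAU_YR)

def aseismic_fraction(mw_total, mw_seismic):
    """Fraction of the combined moment released aseismically,
    using M0 proportional to 10**(1.5 * Mw)."""
    return 1.0 - 10.0 ** (-1.5 * (mw_total - mw_seismic))

f150 = postseismic_fraction(150.0)         # close to 0.95
f_aseismic = aseismic_fraction(6.3, 6.1)   # close to 0.5
```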

What Components of the Mainshock Seismic Waves (Frequency, Amplitude, Duration) Trigger Aftershocks?

Kane, Deborah (IGPP/SIO), Debi Kilb (IGPP/SIO), Arthur Berg (UCSD) and Vladik Martynov (IGPP/SIO)

Though remotely triggered earthquakes have been observed in geothermal areas, little evidence exists revealing such triggering in non-geothermal areas. We look for evidence of remote triggering in southern California using data from fourteen 3-component ANZA seismic network stations. We first identify eight local mainshocks (M>3.2) with obvious aftershock sequences and eight local mainshocks (M>3.0) that lack an obvious aftershock sequence. For these end-member models we apply three different statistical tests (binomial, Wilcoxon rank-sum, and Kolmogorov-Smirnov), which quantitatively confirm the triggering (non-triggering) nature of these local mainshocks. We find that the triggering events generally reach higher spectral amplitudes than the non-triggering events, particularly for frequencies in the range 0.1-10 Hz. Because an aftershock has no knowledge of where (i.e., distance and azimuth) the mainshock occurred, we assume that local and distant mainshocks can cause triggering when appropriate amplitude/frequency conditions are met. We estimate an amplitude/frequency triggering threshold consistent with that observed for local mainshock triggering (non-triggering) events. We assume this hypothesis should hold for most, but not necessarily all, remote mainshock earthquakes. To test this theory, we use the statistical tests to determine the capability of ~40 remote mainshocks (M>7.0) to trigger seismicity in southern California, and compare the spectra of these remote triggering (non-triggering) mainshocks with the local mainshock spectra to help constrain a frequency/amplitude triggering threshold.
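
The comparison of spectral-amplitude populations can be sketched with SciPy's built-in rank-sum and Kolmogorov-Smirnov tests (the binomial test is analogous). The amplitude values here are invented placeholders, not the ANZA measurements:

```python
import numpy as np
from scipy.stats import ks_2samp, ranksums

# Placeholder log10 peak spectral amplitudes in a fixed band for mainshocks
# with aftershocks ("triggering") and without ("non-triggering").
triggering = np.array([1.9, 2.1, 2.3, 2.0, 2.4, 2.2, 2.5, 2.1])
non_triggering = np.array([1.2, 1.5, 1.4, 1.6, 1.3, 1.5, 1.1, 1.4])

rs_stat, rs_p = ranksums(triggering, non_triggering)
ks_stat, ks_p = ks_2samp(triggering, non_triggering)
# Small p-values reject the hypothesis that the two samples share a
# distribution, i.e. the triggering events reach systematically higher
# amplitudes than the non-triggering events.
```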

Simulations of Earthquake Nucleation, Its Static Perturbation, and Aftershock Rates on Faults with Rate and State Friction

Kaneko, Yoshihiro (Caltech) and Nadia Lapusta (Caltech)

Large earthquakes are followed by increased seismic activity, usually referred to as aftershock sequences, that decays to the background rate over time. The decay of aftershocks is well-described empirically by Omori's law. Dieterich (JGR, 1994) proposed that Omori's law could result from perturbing, by a static stress step, a population of nucleation sites governed by laboratory-derived rate and state friction. He used a one-degree-of-freedom spring-slider system to represent elastic interactions and made simplified assumptions about frictional behavior during nucleation. The model was further explored in a number of studies (e.g., Gomberg et al., JGR, 2000) and used to interpret observations (e.g., Toda et al., JGR, 1998).

In this study, we explore the consequences of Dieterich's approach using models of faults embedded in elastic continuum, where the nucleation process can be more complicated than assumed in Dieterich's model. Our approach is different from previous studies of aftershock rates with rate and state friction in that nucleation processes are simulated as a part of spontaneously occurring earthquake sequences, and hence initial conditions for the nucleation processes are determined by the model itself.

We find that nucleation processes in continuum models and the resulting aftershock rates are well-described by the model of Dieterich (1994) when Dieterich's assumption that the state variable of the rate and state friction laws is significantly behind its steady-state value holds during the entire nucleation process. On the contrary, aftershock rates in models where the state variable assumption is violated for a significant portion of the nucleation process exhibit behavior different from Dieterich's model. In particular, for nucleation at transitions between creeping and locked behavior, we observe delayed peaks of aftershock activity followed by seismic quiescence. The state variable assumption is significantly violated when stress heterogeneities are present within the nucleation zone. Aftershock rates are more affected by heterogeneities for larger values of the ratio a/b of rate and state parameters. (Note that 0 < a/b < 1 for steady-state velocity-weakening faults capable of producing earthquakes.) This is consistent with the recent study by Rubin and Ampuero (AGU, 2004) who show that if the ratio a/b is close to one, the state variable assumption will eventually be violated in the middle portions of the nucleation zone. When the computed aftershock rates are different from Dieterich’s model, aftershock rates shortly after the mainshock depend somewhat on the rate and state parameter b and the characteristic slip L, both of which do not affect aftershock rates in Dieterich’s model.

We study spatial effects due to non-uniform stress steps for nucleation at transitions between creeping and locked behavior, where, for a constant stress step, we observe delayed peaks of aftershock activity followed by seismic quiescence. The resultant aftershock rates follow power-law decay and then, several years later, drop to values much lower than the background seismicity rate. This result may explain previously observed turnoff of aftershock activity at the base of the seismogenic zone near the 1984 Morgan Hill earthquake (Tian and Rubin, Chapman meeting abstract, 2005).

Estimation of Spatial Variation of Site Response around Sendai City, Japan

Tsuda, Kenichi and Ralph Archuleta (UC Santa Barbara)

The estimation of site effects on observed ground motion is essential for seismic hazard mitigation. In order to estimate these effects more quantitatively, a dense seismic array has been established near the city of Sendai, Japan. This city is close to one of the most active subduction zones in Japan and has a 99% probability of experiencing a M ≥ 7.5 earthquake in the next 30 years. The array consists of 29 stations: 20 managed by Tohoku Institute of Technology, 6 by BRI, and 3 by NIED, within an area of 20 x 30 km². The records used in this study come from 8 moderate-sized events that occurred near Sendai. To estimate surface site response for all 29 stations, we have separated source and path effects from the observed records by using the method developed by Tsuda et al. (2005a). This method does not depend on a reference station. Using the attenuation model determined by Tsuda et al. (2005b), we first determined the source parameters (seismic moment Mo and corner frequency fc) by analyzing the recordings from 19 KiK-net borehole accelerometers (the same stations as used in Tsuda et al., 2005b). Having the source and path parameters, we computed the frequency-dependent surface site response for all the stations.

After deriving the site response at each station, we tried to predict the site response for a station whose response is unknown. In this study, we selected a K-net station in the Sendai array as the target. In order to predict the site response at this station, we introduced a “Site Correlation Coefficient” (SCC) based on two known site responses. The SCC is calculated for all station pairs separated by less than 5 km, excluding, of course, the K-net station that is our target. Using the SCC results, we calculated averaged site responses as a function of frequency and distance within the range of 1-5 km. The agreement of the predicted site response with the derived site response indicates that the SCC is useful for predicting site response at a given station that is within 5 km of a site with a known site response.
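As a rough illustration of the idea (not the authors' actual SCC formulation, which weights station pairs by their correlation), the response at an uninstrumented site can be approximated from stations within the 5-km cutoff. The station layout and function name below are hypothetical:

```python
import math

def predict_site_response(target_xy, stations, max_dist_km=5.0):
    """Predict the frequency-dependent site response at target_xy as the
    average response of stations within max_dist_km.  `stations` is a list
    of ((x_km, y_km), response_per_frequency_bin) pairs."""
    nearby = []
    for (x, y), response in stations:
        d = math.hypot(x - target_xy[0], y - target_xy[1])
        if 0.0 < d <= max_dist_km:
            nearby.append(response)
    if not nearby:
        raise ValueError("no station within range")
    nfreq = len(nearby[0])
    # average the known responses frequency bin by frequency bin
    return [sum(r[i] for r in nearby) / len(nearby) for i in range(nfreq)]
```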

Finally, we compared the derived site response based on weak-motion data from moderate-sized events with the site response based on strong-motion data from the 2003 (Mj 7.0) and 2005 (Mj 7.2) Miyagi-oki earthquakes. This comparison provides a good example for analyzing soil nonlinearity and its dependence on the level of the input motion.


When Dots-On-Plots Just Don’t Cut It Any More – How to Use Interactive 3-D Visualizations to Explore the Fine-Scale Fault Structure of Southern California (Based on Focal Mechanism Catalog Data)

Kilb, Debi (IGPP/SIO), Jeanne Hardebeck (USGS), and Kristoffer T. Walker (IGPP/SIO)

Fed up with GMT but just can’t seem to break free of the 2-D flat-map world? Too intimidated by the large learning curve for software required to manipulate and view your data in 3D? Resigned to the idea that whatever you need to do, you can do in MATLAB? An intermediate solution is to use the iView3D freeware (http://www.ivs.unb.ca/products/iview3d; runs on multiple platforms) to interactively explore geo-referenced 3D data, to easily toggle different data in and out of view, and to set topography and bathymetry maps transparent to aid in correlating surface and sub-surface features. Starting from (HASH) focal mechanism catalogs, we have used the Fledermaus software associated with iView3D to create 3-D visualizations of the fine-scale fault structure in southern California. The end products are 3-D visualizations that, for each of the >6000 individual earthquakes, include: (1) a sphere in 3-D space representing the earthquake’s latitude, longitude and depth, (2) a rectangle oriented with respect to the strike and dip of the fault (both nodal planes can be included), and (3) color coding to highlight differences among the data such as rake, dip, method used to compute the focal mechanisms (FPFIT or HASH), or temporal behavior. Our initial results show that the fine-scale fault structure in southern California is extremely heterogeneous in comparison with the simple fault structure of the San Andreas Fault near Parkfield and the Hayward fault in the Bay Area. Future plans are to scale the sub-faults by magnitude and extend our study region so we can compare and contrast fine-scale fault complexity in different tectonic settings and incorporate the larger-scale results from the SCEC Community Fault Model (CFM) project (http://structure.harvard.edu/cfm/). These visualizations will be distributed through the visual objects library at the SIO Visualization Center (http://www.siovizcenter.ucsd.edu/library/objects).

A GPS Anomaly in the San Gabriel Valley, California

King, N.E. (USGS), D.C. Agnew (UCSD), Y. Bock (UCSD), R. Dollar (USGS), T. Herring (MIT), L. Jones (USGS), T. Jordan (USC), J. Langbein (USGS), E. Reichard (USGS), and F. Webb (JPL)

GPS time series typically contain a linear trend, offsets caused by earthquakes and equipment changes, and fluctuations that are some combination of white, power law, and bandpass noise. Time series may also contain annual cycles, postseismic transients, or hydrologic signals due to groundwater pumping and withdrawal (Bawden et al., 2001; Argus et al., 2005). In 2005, during a season of extremely heavy rainfall in southern California, large transients appeared in independently processed time series from the U.S. Geological Survey’s “Earthquake Hazard Program,” the Scripps Institution of Oceanography’s “Scripps Orbital and Permanent Array Center,” and the Jet Propulsion Laboratory’s “GPS Data Products for Solid Earth Science” project. These anomalies were observed at stations in the eastern San Gabriel Valley. They appear to begin at 2005.0 and coincide with an abrupt 50-foot increase in groundwater elevation in a San Gabriel Valley water well. To rigorously choose the best-fitting start time of the transient and evaluate its statistical significance, it is necessary to characterize the error spectrum. We used the maximum likelihood method to choose the best error model and estimate its parameters (Langbein, 2004; Williams et al., 2004). Then we ran a moving window (1 year before a “hinge point” and 6 months after) through the time series for five San Gabriel Valley stations, calculating the rate change and its uncertainty at the hinge point. The rate changes are largest for hinge points of 2005.0, with significance levels of 4 to 8 standard deviations for the horizontal components. Vertical rate changes are also significant at some sites, although less so because the vertical data are noisier. We quantified the signal at 2005.0 by modeling the data before this time as the sum of a linear trend, an annual sinusoid, and offsets as appropriate. We extrapolated the model from 2005.0 to the end of the time series and define the signal as the cumulative departure from the extrapolated value. The horizontal signals show an outward-directed radial pattern in the eastern San Gabriel Valley, with magnitudes of 9 to 14 mm. The less significant vertical signals show uplift of 3 to 17 mm. Recent data suggest that the rate changes seen are slowing or even reversing, consistent with a hydrologically induced transient.
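The signal-extraction step described above (fit a trend plus an annual sinusoid before the hinge, extrapolate, take the departure) can be sketched as follows. This is a simplified illustration that omits offsets and the noise-model machinery:

```python
import numpy as np

def transient_signal(t, y, hinge=2005.0):
    """Fit trend + annual sinusoid to data before `hinge` (t in decimal
    years), extrapolate that model past the hinge, and return the
    departure of the later observations from the extrapolation."""
    pre = t < hinge

    def design(tt):
        # columns: offset, rate, annual cosine, annual sine
        w = 2.0 * np.pi * tt
        return np.column_stack([np.ones_like(tt), tt, np.cos(w), np.sin(w)])

    coef, *_ = np.linalg.lstsq(design(t[pre]), y[pre], rcond=None)
    return y[~pre] - design(t[~pre]) @ coef
```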

Frictional Strength of the Punchbowl Fault Ultracataclasite at Seismic Slip Rates

Kitajima, Hiroko (Texas A&M U), Judith S. Chester (Texas A&M U), Toshihiko Shimamoto (Kyoto U), and Frederick M. Chester (Texas A&M U)

Several mechanisms that can lead to a reduction in frictional strength at seismic slip rates have been identified in laboratory friction experiments, but at this time we are uncertain which mechanisms are most important in natural seismic faulting. To further investigate the frictional behavior of natural faults, we have conducted high-speed friction tests of ultracataclasite from the Punchbowl fault, an exhumed, large-displacement fault in the San Andreas system that juxtaposes the Punchbowl Formation and crystalline igneous and metamorphic rock of the San Gabriel basement complex. The ultracataclasite consists of extremely fine particles produced by comminution of host rock with some syn- and post-faulting alteration to zeolite and clay. Two samples of ultracataclasite are used in the experiments: DP4F from near the contact with the Punchbowl Formation and DP189A from near the contact with the crystalline basement. Both samples contain smectite and zeolite, and were prepared for experiments by disaggregating to particle sizes less than 100 microns in diameter.

The disaggregated ultracataclasite was sheared between sawcut cylinders of granite in a high-velocity rotary apparatus at Kyoto University. Frictional strength was measured as a function of displacement to approximately 80 m at slip speeds of 0.1, 0.7 and 1.3 m/s, normal stresses of 0.2, 0.6, and 1.3 MPa, and after pre-compaction for times up to several hours. To facilitate study of the microstructural evolution, samples were sheared to different total displacements, from 1.5 m to 80 m. At 1.3 m/s, the friction coefficient rapidly increases to 1.2 then gradually decreases to 0.2 over a slip-weakening distance (Dc) of about 10 m. At the lower speed of 0.1 m/s, the coefficient of friction is about 0.8 and there is little change in strength with slip. At high slip rates, Dc decreases with an increase in normal stress. Precompaction tends to increase the initial peak frictional strength, but does not affect the residual, steady-state strength. The behaviors of both types of ultracataclasite are similar.

That significant weakening is only observed at high slip rates, and that the critical slip distance for weakening decreases with an increase in normal stress, imply that weakening is a thermally activated process. Moreover, slide-hold-slide tests show rapid strength recovery consistent with transient thermal effects. Current work is directed at correlating microstructures with frictional behavior and identifying weakening processes.

Shallow Seismicity in Stable Continental Regions (SCRs): Implications for Earthquake Hazards

Klose, Christian D. (LDEO) and Leonardo Seeber (LDEO)

A world-wide compilation of strong (Mw 4.5-8.0) and well constrained earthquakes in stable continental regions (SCRs) reveals a bimodal depth distribution with a very shallow upper crustal component. SCR-earthquake ruptures are confined within the upper or the lower thirds of the crust (0-10 km or 20-35 km). Thus, while the mid crust accounts for much of the moment released in active continental regions (ACRs; excluding regions where tectonics has changed crustal thickness), the same depth range in SCRs shows a minimum of seismicity. This remarkable difference has been partly hidden by a tendency to overestimate hypocentral depths in SCRs, where instrumental coverage tends to be sparse. The upper 5 km of ACR-crust are generally weak and seismically opaque, thus releasing relatively little seismic moment and attenuating seismic energy coming from below. In contrast, the upper SCR-crust is generally strong and many large SCR-earthquakes nucleate close to the surface, with severe implications for hazard. On the other hand, the tendency of SCR-earthquakes to occur in sequences and to rupture downward offers an opportunity for improving hazard estimates; after an earthquake occurs, the probability of another earthquake increases dramatically in a source area with no prior seismicity. Such a probability can be quantified from available data. These issues become all the more important because human activities are more likely to cause significant stress changes in the nucleation depth range of SCR-earthquakes than in that of ACR-earthquakes. A world-wide compilation of human-triggered earthquakes due to geomechanical pollution confirms the argument of shallow earthquake nucleation in SCRs.

SCEC/UseIT: Getting a Video out of SCEC-VDO

Kositsky, Aaron (USC)

The UseIT internship program has been a multidisciplinary, collaborative effort, combining the skills of undergraduates from universities across the U.S. to create interactive geological visualization software for use among the scientific community. The latest version of this software has been dubbed SCEC-VDO, and is able to display earthquake data, faults, and other geological phenomena in 3D. This summer, the UseIT program had an overall goal of producing an earthquake monitoring system through added functionality to SCEC-VDO. To further this goal, I developed a full-fledged, standalone rendering engine within the application to produce animated video from a SCEC-VDO session. Earlier versions of SCEC-VDO featured the ability to create a “flyby” by specifying keyframes for the program to move between via interpolation. However, capturing the playback for later presentation or analysis required the use of proprietary screen recording software outside the SCEC-VDO package; additionally, an older computer or a large dataset would cause the animation to slow to a crawl. I was able to mitigate this limitation by moving from a “best effort” rendering process, in which the computer will produce only as many frames of video as it can display in real-time, to a completeness-based rendering process, in which the computer breaks up the animation into a number of planned frames, takes as long as is necessary to produce every frame, then splices the frames together to create a video. More complex scenes will take longer to render initially, but once rendered, the animation will from then on appear smooth. The output video is guaranteed to run in real-time at a broadcast-standard frame rate with no dropped frames introduced during the capture process. With the addition of this in-program rendering engine, the monitoring system may, in future revisions, produce automated flythrough videos of earthquakes as they occur.
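The completeness-based approach amounts to planning every output frame up front at a fixed frame rate and interpolating the camera between keyframes. The Python sketch below illustrates only the frame-planning step (SCEC-VDO itself is a Java application; names here are illustrative):

```python
def planned_frames(keyframes, fps=30):
    """Expand (time_sec, camera_position) keyframes into one camera
    position per output frame by linear interpolation: the list of
    planned frames a completeness-based renderer works through, taking
    as long as needed on each before splicing them into a video."""
    t_end = keyframes[-1][0]
    n = int(t_end * fps) + 1
    frames = []
    seg = 0                      # index of the active keyframe segment
    for i in range(n):
        t = i / fps
        # advance to the keyframe segment containing time t
        while seg + 1 < len(keyframes) - 1 and keyframes[seg + 1][0] <= t:
            seg += 1
        (t0, p0), (t1, p1) = keyframes[seg], keyframes[seg + 1]
        u = (t - t0) / (t1 - t0)
        frames.append(tuple(a + u * (b - a) for a, b in zip(p0, p1)))
    return frames
```

Because the frame times are fixed in advance rather than tied to wall-clock playback, the output runs at the target frame rate regardless of how slowly each frame renders.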

Frequency Dependence of Damping Back-Calculated from Earthquake Records

Kottke, Albert (UT), Takaji Kokusho (Chuo University-Tokyo), I. Suetomi (EDM-NIED), and Ellen Rathje (UT)

At high frequencies, equivalent-linear site response analysis often fails to capture soil amplification, due to overdamping. Typically, soil properties that are used in site response analysis (stiffness, G, and damping, D) are measured in the laboratory, where their strain dependence can be established. The laboratory-measured stiffness properties are corrected with field measurements of the small-strain Gmax, but no direct measurement of damping is available in the field. However, recorded acceleration histories from borehole arrays provide a means to evaluate equivalent-linear, in situ damping ratios.


To estimate soil properties from borehole array data, one-dimensional SH wave propagation is assumed between the recordings. The Fourier amplitude spectra from recordings at different depths are used to compute spectral ratios, and the equivalent-linear properties of the site are varied until the theoretical spectral ratios best match the recorded spectral ratios. The theoretical model can be fit to the recorded spectral ratios through a variety of methods. Simple methods, such as least squares, do not take into account the measured soil properties at the site, producing questionable back-calculated values. The extended Bayesian method compares the likelihood that the recorded motion is predicted by the model, given the measured soil properties. The approach is powerful because it balances the likelihood of both the prior knowledge and the recorded information (Suetomi, 1997).

This study uses the extended Bayesian method to back-calculate the properties of a one-dimensional layered soil system from spectral ratios calculated from borehole array recordings. The variable soil properties and the recorded spectrum are described as normal distributions with a given mean and standard deviation. The back-calculation algorithm adjusts the stiffness and damping terms to maximize the likelihood between the predicted and recorded spectral ratios.

In this study, the back-calculation was performed at the SGK site for both the mainshock and aftershocks of the 1995 Kobe earthquake. Three models of frequency dependence for damping (D = 1/(2Q)) were considered: (1) constant Q (no frequency dependence), (2) Q = a·f^b, and (3) 1/Q = 1/Qi + 1/(a·f^b). Models (2) and (3) provide less damping at higher frequencies. The success of each of the models is assessed using the Akaike Information Criterion (AIC). The AIC balances the goodness of fit of a model against the number of variables in the model to give an unbiased estimate of the success of the model.
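The three Q(f) models and the AIC comparison can be written out directly. A minimal sketch with illustrative parameter names:

```python
def damping(f, model, **p):
    """Damping ratio D = 1/(2Q) for the three Q(f) models in the text:
    (1) constant Q, (2) Q = a*f**b, (3) 1/Q = 1/Qi + 1/(a*f**b)."""
    if model == 1:
        Q = p["Q"]
    elif model == 2:
        Q = p["a"] * f ** p["b"]
    else:
        Q = 1.0 / (1.0 / p["Qi"] + 1.0 / (p["a"] * f ** p["b"]))
    return 1.0 / (2.0 * Q)

def aic(log_likelihood, n_params):
    """Akaike Information Criterion: lower is better, so extra
    parameters must buy enough likelihood to pay for themselves."""
    return 2.0 * n_params - 2.0 * log_likelihood
```

Because Q grows with frequency in models (2) and (3), both give less damping at high frequencies, which is exactly where equivalent-linear analysis tends to overdamp.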

The preliminary results demonstrate the importance of frequency-dependent damping in the prediction of the aftershock spectral ratios. There is significant improvement in the prediction of the aftershocks using model (2). There is similar improvement with model (3), but the improvement does not justify the use of more model parameters according to the AIC values. The mainshock at SGK does not show an improvement with increasing frequency dependence; this may be due to the large intensity of the Kobe mainshock and the proximity of the fault to the SGK site. There are a total of four borehole arrays for the Kobe aftershocks and mainshock that will be used to better understand the frequency dependence at the different intensity levels.

On the Random Nature of Earthquake Processes: A Case Study of the 2004 Parkfield Earthquake

Lavallée, Daniel (UC Santa Barbara), Susana Custodio (UC Santa Barbara), Pengcheng Liu (UC Santa Barbara), and Ralph J. Archuleta (UC Santa Barbara)

In a series of papers (Lavallée and Archuleta, 2003, 2005; Lavallée et al., 2005), we have laid the basis for a theory that provides a coherent and unified picture of earthquake variability from its recording in the ground motions to its inference in source models. Based on the superposition of seismic waves and the Central Limit Theorem, this theory stipulates that the random properties of the ground motions and the source for a single earthquake should both be distributed according to a Levy law. Our investigation of the random properties of the source model and peak ground acceleration (PGA) of the 1999 Chi-Chi earthquake confirms this theory (see: http://www.scec.org/core/public/showNugget.php?entry=2118). As predicted by the theory, we found that the tails of the probability density functions (PDF) characterizing the slip and the PGA are governed by a parameter, the Levy index, with almost the same values, close to 1. The PDF tail controls the frequency at which extreme large events can occur. These events are the large stress drops—or asperities—distributed over the fault surface and the large PGA observed in the ground motion. Our results suggest that the frequency of these events is coupled: the PDF of the PGA is a direct consequence of the PDF of the asperities.


The 2004 Parkfield earthquake is the best-recorded earthquake in history in terms of the density of near-source data. It provides an ideal candidate for evaluating and validating the theory discussed above. For this purpose, we used several source models computed for the Parkfield earthquake by Custodio et al. (2005). All the source models used in this study are based on a method to invert kinematic source parameters developed by Liu and Archuleta (2004). The compiled source models differ in the number and location of the stations used in the inversion. For each source, we compile the parameters of the stochastic model and compare them to the random properties of the PGA. We found that the tails of the probability density functions (PDF) characterizing the PGA are governed by a parameter, the Levy index, with a value close to 1. For several source models, the computed Levy index is in good agreement with this value. Our results suggest that not all source models are equivalent in terms of their random properties. The values of the stochastic parameters depend on the location and number of stations used in the inversion. Thus, this study provides the basis to compare, validate and optimize computed source models by comparing the random properties of the source to the random properties of the ground motions.
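A crude way to illustrate the tail-index idea (not the authors' estimation procedure) is a log-log regression on the empirical survival function of the data's upper tail, since a heavy tail with P(|X| > x) ~ x^(-alpha) is linear in log-log space:

```python
import numpy as np

def tail_index(x, tail_frac=0.1):
    """Estimate the power-law exponent alpha of the upper tail of |x|
    from the slope of the empirical survival function in log-log space
    (a simple proxy for a Levy index when alpha < 2)."""
    x = np.sort(np.abs(np.asarray(x, dtype=float)))
    n = len(x)
    k = max(int(n * tail_frac), 10)
    tail = x[-k:]
    # the i-th largest sample has empirical survival probability i/n
    surv = np.arange(k, 0, -1) / n
    slope, _ = np.polyfit(np.log(tail), np.log(surv), 1)
    return -slope
```

A Levy index near 1, as reported above, corresponds to a tail so heavy that the variance (and for alpha ≤ 1 even the mean) of the underlying distribution diverges.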

SCEC/UseIT: Saving and Loading State Information in SCEC-VDO

Lee, Michael (NCSU)

The UseIT (Undergraduate Studies in Earthquake Information Technology) program organizes a group of undergraduate interns to develop earthquake visualization software for researchers. The past two years have been devoted to SCEC-VDO (SCEC Visual Display of Objects), a new earthquake visualization software package aimed at replacing an older existing program. This summer in particular has focused on improving the earthquake monitoring capabilities in SCEC-VDO. Part of the push to improve the capabilities of SCEC-VDO required the development of a saving/loading mechanism for the program (i.e., a state vector). Building on the groundwork laid the previous summer (Manselle, 2004), a systematic approach of separating the parsing of the actual save file (stored in XML format) from the programmatic saving and loading methods was set up to ensure the cleanest and most versatile system possible. While some technical challenges were encountered with restoration of GUIs within the program, these problems were rectified by altering the original architecture for greater flexibility. With the state vector near full implementation, users can quickly restore previous settings, facilitating efficient use of the program during monitoring situations.
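The separation of XML parsing from programmatic restore can be shown in miniature (a Python stand-in for the Java implementation; element and attribute names are hypothetical):

```python
import xml.etree.ElementTree as ET

def state_to_xml(state):
    """Serialize a {plugin_name: {setting: value}} state vector to XML."""
    root = ET.Element("session")
    for plugin, attrs in state.items():
        node = ET.SubElement(root, "plugin", name=plugin)
        for key, val in attrs.items():
            ET.SubElement(node, "setting", key=key, value=str(val))
    return ET.tostring(root, encoding="unicode")

def xml_to_state(text):
    """Parse the XML back into a plain dict.  Restoring live objects
    (GUIs, displayed layers) from this dict is a separate step, which
    is what keeps the file format decoupled from the program."""
    state = {}
    for node in ET.fromstring(text).iter("plugin"):
        state[node.get("name")] = {s.get("key"): s.get("value")
                                   for s in node.iter("setting")}
    return state
```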

Fault Interaction within Major Strike-Slip Fault Restraining Bends in Southern California

Legg, Mark (Legg Geophysical), Chris Goldfinger (Oregon State), and Jason Chaytor (Oregon State)

The mostly submarine California Continental Borderland provides exceptional examples of active and ancient restraining (and releasing) bend structures along major strike-slip fault zones. Erosional processes in the deep sea are greatly diminished compared to subaerial regions, thus allowing preservation of persistent oblique fault deformation in the seafloor morphology. Active deposition of turbidites and other marine sediments preserve a high-resolution geological record of the fault zone deformation and regional tectonic evolution. Multibeam swath bathymetry combined with high-resolution seismic reflection profiling provide us with detailed images of the seafloor morphology and shallow crustal geometry of these important strike-slip structures.

The 80-km long, 30 to 40 degree left bend in the submarine San Diego Trough - Catalina fault zone creates a large pop-up structure that emerges to form Santa Catalina Island. This ridge of igneous and metamorphic basement rocks has steep flanks and a classic "rhomboid" shape. A 7.5-km right step-over in the Catalina fault produces a prominent embayment along the southwest side of the uplift, forming an elevated pull-apart structure. The San Pedro Basin fault splits from the Catalina fault at the southeast end of the bend, shunting some right-slip around the restraining bend much like the Eastern California shear zone shunts some right-slip around the major San Andreas restraining bend of the modern Pacific-North America transform fault system. Also, the San Clemente fault merges with the Catalina fault at the southeast end of the Santa Cruz-Catalina Ridge forming a deep pull-apart basin remarkably similar to the intersection of the San Jacinto and San Andreas faults near Cajon Pass between the San Gabriel and San Bernardino Mountains.

Development of major restraining bends offshore southern California appears to result from reactivation of major transform faults associated with middle Miocene oblique rifting during the evolution of the Pacific-North America plate boundary (Legg and Kamerling, 2004). Seismicity offshore southern California demonstrates the active character of these major right-slip fault systems. Transpression along major restraining bends would tend to lock the fault such that accumulated tectonic strain is released during large earthquakes like the 1857 Fort Tejon earthquake on the southern San Andreas fault. Recent moderate earthquakes (M5.5-6.0) at the two ends of the Santa Catalina fault restraining bend may bound a "seismic gap" or "Mogi donut" surrounding a locked fault segment that may rupture in a large (M7+) earthquake in the near future, or they may instead represent afterslip of a recent prehistoric earthquake. The overall length of this major offshore restraining bend and the broad area of seafloor uplift further pose a threat of locally generated tsunami during such large submarine earthquakes.

High-Resolution Seismic Reflection Data to Determine Holocene Activity of the Compton Blind-Thrust Fault, Los Angeles Basin, California

Leon, Lorraine A. (USC), James F. Dolan (USC), Thomas Pratt (UW Seattle), and John H. Shaw (Harvard)

The Compton thrust fault is a large blind thrust fault that extends northwest-southeast for 40 km beneath the western edge of the Los Angeles basin. It was originally identified by Shaw and Suppe (1996) using industry seismic reflection profiles and well data. The seismic reflection data define a growth fault-bend fold associated with the thrust ramp, which, combined with well data, reveals compelling evidence for its Pliocene and Pleistocene activity. The industry data, however, do not image deformation in the uppermost few hundred meters, the area of most interest for determining the recent seismic history of the fault. In order to bridge this gap, we acquired high-resolution seismic reflection profiles on the back-limb active axial surface of the fault-bend fold above the Compton Fault thrust ramp. Our study site is located in South-Central Los Angeles on Stanford Street, east of the 110 freeway and ~6 km south of downtown Los Angeles. A 1.1-km-long, high-resolution seismic reflection profile was acquired using a truck-mounted weight-drop source to delineate the axial surfaces of the fold from the upper tens of meters downward, overlapping with the upper part of the industry reflection data within the upper 500 m. We also acquired a ~700-m-long, higher-resolution sledgehammer seismic profile designed to image the upper 50-100 m. The reflection data are currently being processed, and we hope to show preliminary versions of these profiles at the SCEC meeting. These high-resolution data will allow us to accurately and efficiently site a fault-perpendicular transect of boreholes. The borehole data will, in turn, allow us to document the recent folding history during Compton blind-thrust earthquakes.


Spatial and Temporal Length Scales Characterizing the Evolution of Seismicity Rates

Levin, Shoshana (UWO) and Kristy Tiampo (UWO)

Numerous studies have documented systematic changes in seismicity rates preceding large magnitude events. We use two approaches to examine the spatial and temporal scales characterizing these rate changes. The first method considers changes in yearly seismicity rates as a function of distance from the rupture plane of major historical events. To quantify the significance of trends in the seismicity rates, we auto-correlate the data, using a range of spatial and temporal lags. Here, we focus on the results for the 1992 Landers, CA, earthquake and the 2002 Denali, AK, earthquake. We also briefly address the results for the 1971 San Fernando, 1983 Coalinga, 1986 Chalfant Valley, 1987 Superstition Hills, 1989 Loma Prieta, 1994 Northridge and 1999 Hector Mine events.

Our second approach follows the methodology outlined in Tiampo et al. [2002] for determining the eigenfunctions describing spatial and temporal correlation in regional seismicity. We extend the analysis by incorporating a temporal lag in the construction of the covariance matrix. Decomposing the matrix into its eigenmodes then highlights correlated activity separated in time by the specified lag. Here, we present the results obtained for southern California seismicity from 1932 to 2004, using a range of temporal lags.
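The lagged-covariance eigenmode decomposition can be sketched with NumPy; here `rates` holds one seismicity-rate time series per spatial cell (a simplified stand-in for the actual analysis):

```python
import numpy as np

def lagged_eigenmodes(rates, lag=0):
    """Eigen-decomposition of the (optionally time-lagged) covariance
    of seismicity-rate time series, one column per spatial cell.
    Returns eigenvalues in descending order and the matching modes."""
    x = rates - rates.mean(axis=0)
    if lag:
        a, b = x[:-lag], x[lag:]
    else:
        a = b = x
    cov = a.T @ b / (len(a) - 1)
    cov = 0.5 * (cov + cov.T)      # symmetrize the lagged covariance
    w, v = np.linalg.eigh(cov)
    order = np.argsort(w)[::-1]
    return w[order], v[:, order]
```

With lag = 0 this reduces to the usual Karhunen-Loeve style decomposition; a nonzero lag concentrates variance in modes whose cells vary together with the specified time offset.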

High-Resolution Imaging of the Deep Structure of the Bear Valley Section of the San Andreas Fault with Joint Analysis of Fault-Zone Head Waves and Direct P Arrivals

Lewis, Michael (USC), Yehuda Ben-Zion (USC), and Jeff McGuire (WHOI)

Understanding the structure of large faults is an important step towards understanding earthquake processes on those faults. The short length scales that are important for earthquake physics cannot be resolved at depth by conventional geophysical methods. The utilization of seismic energy trapped within low-velocity fault zone layers can yield detailed images of the fault structure. However, recent studies at a number of locations have indicated that trapped waves are typically generated only by the top ~3 km of the fault zones, above the seismogenic portion of the structures. Since major faults typically juxtapose rocks with different elastic properties, the contrast in materials can lead to the generation of fault zone head waves that spend the majority of their propagation paths refracting along the fault interface. The incorporation of fault zone head waves in imaging studies can thus resolve important small-scale elements of the fault zone structure at seismogenic depths.

In this study we perform a joint direct P and head wave travel-time inversion to produce a separate 1D velocity model for each side of the San Andreas fault in the Bear Valley region. The data come from a dense temporary array of seismometers deployed by Thurber et al. (1997) and the permanent northern California seismic network stations in the area. We have picked arrival times from 450 events at up to 54 stations, resulting in over 9800 direct P-wave and over 2700 head-wave arrival times. One set of inversions is performed on the whole data set, and five inversion sets are done on various data subsets to try to understand details of the velocity structure. The results imply a strong velocity contrast of ~50% in the near surface that reduces rapidly to 10-20% below 3 km. The presence of a shallow damage zone around the fault is detected by inversions using subsets of the data made up of only stations close to the fault. The faster (southwest) side of the fault shows the development of a low-velocity layer at the surface as instruments closer to the fault (<5 km and <2 km) are used; such a feature is not present in inversions using only stations at greater distances from the fault. On the slower (northeast) side of the fault the presence of a shallow low-velocity layer is only detected in the inversions using the stations within 2 km of the fault. The asymmetry of the shallow low-velocity layer may reflect a preferred propagation direction of earthquake ruptures. Using events from different portions of the fault, the head wave inversions also resolve small-scale features of the fault visible in the surface geology and relocated seismicity.
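For intuition, the travel-time contrast that makes head waves useful can be illustrated for an idealized vertical bimaterial interface (a textbook refraction geometry, not the inversion code used in the study):

```python
import math

def arrival_times(L, x, v_slow, v_fast):
    """First-arrival times for a receiver a distance x from a vertical
    material interface, with the source on the interface a distance L
    away along it.  Returns (direct P time through the slow medium,
    head-wave time refracted along the interface at v_fast).  The head
    wave only exists, and only arrives first, beyond a crossover
    distance set by the velocity contrast."""
    t_direct = math.hypot(L, x) / v_slow
    t_head = L / v_fast + x * math.sqrt(1.0 / v_slow**2 - 1.0 / v_fast**2)
    return t_direct, t_head
```

Because the head-wave/direct-P delay grows with propagation distance along the interface, jointly fitting both arrivals constrains the velocity on each side of the fault separately.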

Seismic Evidence for Rock Damage and Healing on the San Andreas Fault Associated with the 2004 M6 Parkfield Earthquake

Li, Yong-Gang, Po Chen (USC), Elizabeth Cochran, John Vidale (UCLA), and Thomas Burdette (USGS)

We deployed dense linear seismic arrays of 45 seismometers across and along the San Andreas fault near Parkfield immediately after the M6 Parkfield earthquake on September 28, 2004 to record fault-zone seismic waves generated by near-surface explosions and earthquakes. Seismic stations and explosions were co-sited in our previous experiment in the fall of 2002. The data from repeated shots in the two surveys show ~1.0-1.5% decreases in seismic velocity within the ~200-m-wide zone along the fault trace and less changes (0.2-0.5%) beyond this zone, most likely due to the coseismic damage of rocks during dynamic rupture of the 2004 M6 earthquake. The width of this zone characterized by greater velocity changes is consistent with the low-velocity waveguide model on the San Andreas fault, Parkfield we derived from fault-zone trapped waves (Li et al., 2004). The damage zone is not symmetric with the main fault trace but broader on the southwest side of the fault. Waveform cross-correlations for repeated aftershocks in 21 clusters, total ~130 events, located at different depths and distances from the array site show ~0.7-1.1% increases in S-wave velocity within the fault zone in 3 months starting a week after the mainshock, indicating that the damaged rock has been healing and regaining the strength with time, most likely due to the closure of cracks opened in the mainshock. The healing rate was greater in the earlier stage of post-mainshock healing. We estimate the velocities within the fault zone decreased by at least ~2.5%, most likely associated in the 2004 M6 event at Parkfield. The magnitude of fault healing is not uniform along the ruptured segment; it is slightly larger beneath Middle Mountain in accordance with larger slip mapped there. The fault healing is seen at depth above ~6-7 km and is likely to be depth-dependent, too. 
The damage and healing progression observed on the SAF associated with the 2004 M6 Parkfield earthquake is consistent with our previous observations at the rupture zones of the 1992 M7.4 Landers and 1999 M7.1 Hector Mine earthquakes. However, the degree of fault-zone damage is smaller, and the healing process shorter, on the SAF at Parkfield than at Landers and Hector Mine.

Low-Velocity Structure of the San Andreas Fault near the SAFOD Drilling Site at Parkfield from Fault-Zone Trapped Waves

Li, Yong-Gang (USC), John Vidale, Elizabeth Cochran (UCLA), Peter Malin (Duke), and Po Chen (USC)

COMPLOC_1.0 is a set of Fortran77 programs for locating local earthquakes that implements the source-specific station term (SSST) method for improving relative location accuracy among nearby events. The programs in this package have been tested on data from the Southern California Seismic Network (SCSN) and the Northern California Earthquake Data Center (NCEDC) on both Mac and Sun systems. Locations can be obtained using the single-event, static station term, simple SSST, and shrinking-box SSST methods by assigning different values to location parameters. The main location program uses P and S travel-time tables generated by the user from a 1-D velocity model. A station location file is required, which in California can be obtained from the station lists on the SCSN and NCEDC websites. The station file may also include starting station terms (if available). Phase pick data are read from STP or HYPOINVERSE formats. Starting locations for the events in the phase pick file must be available; comploc is not designed to locate events without any prior location estimates. A grid-search approach and robust misfit norms make the method relatively insensitive to the effects of gross picking or timing errors. We demonstrate with several examples how this package can be used to relocate earthquakes. The current version of the software computes locations using phase pick data alone; future versions of the codes will also include precise differential time constraints from waveform cross-correlation. A beta test version of COMPLOC_1.0 is available at: http://igpphome.ucsd.edu/~glin/.
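The station-term step at the heart of the SSST family of methods can be sketched in a few lines. This is an illustrative simplification with toy residuals, not code from the COMPLOC package; the function names and data are hypothetical. In the static variant shown here, each station's correction is the average of its travel-time residuals over all events; in the full SSST the average is taken only over events near each source.

```python
# Sketch of the static-station-term correction used in SSST-style
# relocation (illustrative only; not the COMPLOC implementation).

def static_station_terms(residuals):
    """residuals: dict station -> list of travel-time residuals (s)
    over all events. Returns one correction term per station."""
    return {sta: sum(r) / len(r) for sta, r in residuals.items()}

def apply_terms(residuals, terms):
    """Subtract each station's term from its residuals, as done
    before the next relocation pass."""
    return {sta: [r - terms[sta] for r in res]
            for sta, res in residuals.items()}

# Toy example: station "PKD" has a systematic +0.3 s path delay.
resid = {"PKD": [0.31, 0.29, 0.30], "MMN": [-0.02, 0.01, 0.01]}
terms = static_station_terms(resid)
corrected = apply_terms(resid, terms)
```

Iterating location and term estimation (shrinking the averaging neighborhood each pass, in the shrinking-box variant) progressively removes the path-anomaly bias common to nearby events.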

Calculation of Broadband Time Histories: Validation Using Data Sets from the M6.7 Northridge Earthquake

Liu, Pengcheng (UCSB), Stephen Hartzell (USGS), and Ralph Archuleta (UCSB)

We have developed a technique for kinematic modeling of an extended earthquake source that is based on distribution functions for the slip amplitude, duration of slip (rise time), and rupture velocity. The complexity of the source process is represented by spatial distributions of randomized source parameters, but the integrated characteristics of these parameters are constrained by the total moment (magnitude), the radiated energy, and the high-frequency decay of the spectral amplitudes in the source spectrum. Dynamic modeling of complex rupture processes (e.g., Oglesby and Day, 2002; Guatteri et al., 2003) shows that the areas of large slip correlate with high stress drop, fast rupture velocity, and short rise time, but the correlation between rise time and slip is not as strong as the correlation between rupture velocity and slip (Oglesby and Day, 2002). Based on these studies, we assume that the correlation between rupture velocity and slip is about 60% and the correlation between rise time and slip is -30%. Given the kinematic source model, we use one-dimensional approximations to the velocity structure and the FK method to calculate high-frequency (greater than 1 Hz) ground motions; we use a three-dimensional Earth model to calculate synthetic ground motions for frequencies up to one to two Hertz. The 3D model incorporates the geometry of the geology in the area, including the deep basin structures. The high- and low-frequency synthetic ground motions are stitched together to form broadband time histories of ground motion. To account for nonlinear soil effects, we first deconvolve the synthetic time history to the bedrock level using the geotechnical information. This bedrock time history is then propagated to the surface using a 1D nonlinear wave propagation code (e.g., Bonilla et al., 1998).
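The stated cross-correlations can be imposed on randomized parameter fields with the standard two-variable Cholesky construction. The following stdlib-only sketch is illustrative, not the authors' implementation; it works on normalized (zero-mean, unit-variance) fields, ignoring the spatial correlation structure that a real source model would also carry.

```python
# Generate parameter fields with prescribed correlation to a base field:
# y = rho * base + sqrt(1 - rho^2) * noise  (two-variable Cholesky form).
import math
import random

def correlated(base, rho, rng):
    """Return a field correlated with `base` at level `rho`."""
    return [rho * b + math.sqrt(1.0 - rho * rho) * rng.gauss(0.0, 1.0)
            for b in base]

def corr(x, y):
    """Sample correlation coefficient (for checking the result)."""
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    sxy = sum((a - mx) * (b - my) for a, b in zip(x, y))
    sx = math.sqrt(sum((a - mx) ** 2 for a in x))
    sy = math.sqrt(sum((b - my) ** 2 for b in y))
    return sxy / (sx * sy)

rng = random.Random(1)
slip = [rng.gauss(0.0, 1.0) for _ in range(20000)]  # normalized slip field
vrup = correlated(slip, 0.6, rng)                   # ~60% correlated
rise = correlated(slip, -0.3, rng)                  # ~-30% correlated
```

The normalized fields would then be rescaled to physical slip, rupture-velocity, and rise-time distributions subject to the moment and spectral constraints described above.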

This method for calculating broadband ground motions is evaluated using data sets from the 1994 M6.7 Northridge earthquake. We calculate the average misfit and standard error of the Fourier amplitude spectra and peak accelerations between the synthetics and the observed data.

Aseismic Slip Transients Induced by In-Slab Extensional Events in Subduction Zones and Effects on Future Thrust Earthquakes -- Numerical Study by a 2D Rate and State Model

Liu, Yajing (Harvard) and James R. Rice (Harvard)

To investigate the possible signature of aseismic slip transients in stress transfer, and their effect on seismic hazard in major subduction zones, we first studied the spatial-temporal seismicity variations over appropriately chosen time spans that bracket detected transients in the Mexico (Guerrero), Cascadia, and Japan (Bungo Channel) subduction zones. For the large 2001-2002 Guerrero transient, we found a pattern suggesting that the aseismic transient acted as a means of communication between an extensional seismicity cluster downdip in the slab and later thrust seismicity in the shallow seismogenic zone.

We then performed a numerical investigation with a 2D model of subduction earthquake sequences in the framework of the Dieterich-Ruina version of rate and state friction. Friction properties are temperature-dependent, and hence depth-dependent, incorporating appropriate thermal profiles of the subduction interface. Without stress perturbations from extensional events, periodic large thrust earthquakes occur under the assumed 2D modeling parameters. We represent the effects of an in-slab extensional event by imposing on the subduction fault, within the interseismic period, the static stress changes due to such normal-fault slip. We found that three factors of the introduced stress perturbation, namely (1) its timing within the interseismic period, (2) its location along the subduction fault, and (3) its magnitude, greatly affect the induced transients and future large thrust earthquakes. Generally, sequential transients appear before the next thrust earthquake if the perturbation is introduced in the early stage of an interseismic period. Sequential transients tend to migrate toward the velocity-weakening to velocity-strengthening transition zone even if the perturbation is first introduced much further downdip of that region. The slip rate during the transients increases as the magnitude of the stress perturbation increases.

Detailed simulations will be run to quantify the stress distribution caused by these transients, how the sequential transients are excited and their effects in advancing or delaying future seismic events.
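For readers unfamiliar with the friction framework invoked here, a minimal sketch of the Dieterich-Ruina formulation with "aging law" state evolution follows. All parameter values are illustrative only, not those of the study; at constant sliding velocity the state variable approaches d_c/v, so friction approaches its steady-state value mu0 + (a - b) ln(v/v0), velocity-weakening when a < b.

```python
# Minimal Dieterich-Ruina rate-and-state sketch (aging law),
# with illustrative laboratory-scale parameter values.
import math

def evolve_state(theta, v, d_c, dt, nsteps):
    """Integrate d(theta)/dt = 1 - v * theta / d_c with forward Euler."""
    for _ in range(nsteps):
        theta += dt * (1.0 - v * theta / d_c)
    return theta

def friction(v, theta, mu0=0.6, a=0.010, b=0.015, v0=1e-6, d_c=0.01):
    """Rate-and-state friction coefficient for slip rate v and state theta."""
    return mu0 + a * math.log(v / v0) + b * math.log(v0 * theta / d_c)

# Slide at v = 1e-5 m/s long enough to reach steady state:
v = 1e-5
theta = evolve_state(theta=1.0e4, v=v, d_c=0.01, dt=1.0, nsteps=200000)
mu_ss = friction(v, theta)   # near mu0 + (a - b) * ln(v / v0)
```

With a < b as here, steady-state friction decreases with increasing slip rate, the velocity-weakening behavior assigned to the seismogenic zone in such models.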

Three-Dimensional Simulations of Spontaneous Earthquake Sequences

Liu, Yi (Caltech) and Nadia Lapusta (Caltech)

Simulations of spontaneous earthquake sequences in three-dimensional (3D) models are of great interest because of their ability to clarify earthquake physics. We are developing a procedure for simulating sequences of dynamic events on a fault subjected to rate and state friction and slow tectonic loading. The algorithm, extended from the 2D study of Lapusta et al. (2000), allows us to resolve all stages of every earthquake in a single computational procedure, including the quasi-static nucleation process, dynamic rupture propagation, post-seismic deformation, and slow creeping slippage throughout the loading period. Simulating earthquake sequences is quite challenging even in 2D models due to the variety of temporal and spatial scales involved; switching to 3D requires overcoming new theoretical and computational challenges.

We have just started our simulations, considering first a buried seismogenic region with steady-state velocity-weakening properties surrounded by a steady-state velocity-strengthening region that stably slips (creeps) under loading. The seismogenic region is 30 km long and 15 km deep. We find that, for large characteristic distances of the rate and state friction, the model produces large, model-spanning quasi-periodic earthquakes. We also find that changing the model in the direction of decreasing inertial effects slows down the rupture propagation, decreases the peak slip velocities, and reduces the recurrence period of the earthquakes. These results are consistent with the previous 2D studies. In the future, we plan to further develop the methodology, to enable 3D studies of more realistic parameters and to allow for the free surface in the model. We will use the developed methodology to study earthquake nucleation, dynamic propagation, post-seismic slip, and their interaction, concentrating first on small event clustering on rheological transitions and interaction of earthquake sequences with fault heterogeneities.

Detecting and Imaging Aseismic Transient Deformation in Southern California

Liu, Zhen and Paul Segall (Stanford)

Large-scale continuous GPS networks have documented a wide range of aseismic deformation transients resulting from various physical processes, including aseismic fault slip and magma intrusion. Recently, a variety of transient slip events have been discovered in subduction zone environments. Previously, transient slip episodes were identified along the San Andreas Fault near transitions from creeping to locked segments of the fault. The Southern California Integrated GPS Network (SCIGN) provides the possibility to explore for these and other aseismic deformation transients in a major strike-slip plate boundary setting. We are applying the Extended Network Inversion Filter (ENIF) of McGuire and Segall (Geophys. J. Int., 2003), a time-dependent inversion algorithm based on an extended Kalman filter, to search for possible aseismic transient slip signals in Southern California. The ENIF is able to separate signals from various processes and to image the spatial-temporal evolution of slow-slip events in detail.

We extended the method to triangular-mesh surfaces by introducing triangular dislocations and spatial smoothing on the triangular fault surface. The new filter version (named TNIF) enables us to incorporate complex 3-dimensional fault geometry. We construct a fault model using the SCEC Community Fault Model (CFM), combining slip rates from the USGS/CGS 2002 fault database with geometry from the CFM. To regularize the surface mesh while reducing the number of inversion parameters, we remeshed all faults of the CFM with slip rate >= 2.0 mm/yr onto coarser grids by applying point/boundary constraints and isotropic DSI interpolation. Raw SINEX solutions from the Scripps Orbit and Permanent Array Center (SOPAC) are used in a differential sense in the inversion. The time series are cleaned by removing offsets and outliers identified by SOPAC. To date we have applied the TNIF to the area within 30 km of the San Andreas Fault (74 sites), using data from January 2001 to June 2005. Postseismic deformation following the Sep. 28, 2004 Mw 6.0 Parkfield earthquake is easily identified and reasonably well modeled. The filter identifies other areas of transiently high slip rate; however, these regions are dominated by a few sites and require further study. We are currently in the process of including more faults and GPS stations in our inversion filter and exploring transient signals on other faults of southern California.
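As a much-reduced illustration of the Kalman-filter machinery underlying the ENIF/TNIF, the following stdlib sketch tracks a single slowly varying rate from noisy scalar positions. The real filter estimates slip on a fault mesh from a full station network; everything here, names, noise levels, and the scalar state, is hypothetical.

```python
# Scalar position/rate Kalman filter: state (x, v), position observations.
import random

def kalman_rate(observations, dt=1.0, q=1e-8, r=0.0025):
    """Track position x and rate v from noisy position observations.
    q: random-walk variance added to the rate each step;
    r: observation variance."""
    x, v = 0.0, 0.0                        # state estimate
    P = [[1.0, 0.0], [0.0, 1.0]]           # state covariance
    for z in observations:
        # Predict: constant-rate motion; the rate performs a random walk.
        x += v * dt
        P = [[P[0][0] + dt * (P[0][1] + P[1][0]) + dt * dt * P[1][1],
              P[0][1] + dt * P[1][1]],
             [P[1][0] + dt * P[1][1],
              P[1][1] + q]]
        # Update with the position observation z.
        s = P[0][0] + r
        k0, k1 = P[0][0] / s, P[1][0] / s
        resid = z - x
        x += k0 * resid
        v += k1 * resid
        P = [[(1.0 - k0) * P[0][0], (1.0 - k0) * P[0][1]],
             [P[1][0] - k1 * P[0][0], P[1][1] - k1 * P[0][1]]]
    return v

rng = random.Random(0)
true_rate = 0.03                           # arbitrary units
obs = [true_rate * t + rng.gauss(0.0, 0.05) for t in range(1, 301)]
est = kalman_rate(obs)                     # recovers a rate near 0.03
```

The random-walk process noise q is what lets such a filter follow transiently changing slip rates instead of averaging them away.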

GPS Data Collection in the San Bernardino Mountains

Lopez, Amanda (SCEC/SURE Intern, CSUSB), Sally McGill (CSUSB), Joan Fryxell (CSUSB), Adam Skalenakis (SCEC/SURE Intern, Harvey Mudd College), and Greg Lyzenga (Harvey Mudd College)

The objective of our project was to collect GPS data from the areas that surround the San Andreas and San Jacinto faults so that we could update the velocity vectors on a map of the area. Another major part of the data collection was to fill gaps in the San Bernardino mountain range by occupying stations that did not have previous data, starting a new data stream for future years.

From June 21 to July 1, 2005, SCEC intern Adam Skalenakis and I participated in a two-week NSF-funded GPS campaign, collecting data from twelve stations across the San Andreas and San Jacinto faults. We spent this time learning how to set up GPS equipment, collecting data, and assisting 17 high school teachers, 13 undergraduates, and 12 high school students with the same process. Each site was occupied for three to four days, with eight hours of data collected per day. After the campaign was over, Adam and I occupied eleven more stations in the San Bernardino Mountains in order to fill in gaps in SCEC's Crustal Motion Model 3 (CMM3). At least 24 hours of continuous data were collected at each of these sites.

After the data collection, my part of the project was to process the GPS data using Auto-GIPSY, a web service supported by the Jet Propulsion Laboratory (JPL) that retrieves RINEX files from a server and processes them using JPL's GIPSY software. This gave me the position of each station in terms of deviation from the nominal coordinates in the RINEX file. I then plotted position as a function of time for each site and updated the velocity estimate for each site. I came across some complications when processing through Auto-GIPSY, but these were easily fixed by reformatting the RINEX files using TEQC, a program supplied by UNAVCO. I also went back to previous years and reprocessed data that did not process correctly the first time.
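The per-site velocity update described above amounts to fitting a straight line to the position time series; a minimal sketch with synthetic numbers (not actual campaign data):

```python
# Least-squares line fit x = x0 + v*t; the slope v is the site velocity.

def velocity_fit(t, x):
    """Slope of the best-fit line through (t, x)."""
    n = len(t)
    mt = sum(t) / n
    mx = sum(x) / n
    num = sum((ti - mt) * (xi - mx) for ti, xi in zip(t, x))
    den = sum((ti - mt) ** 2 for ti in t)
    return num / den

# Positions (mm) at yearly campaign epochs for a hypothetical site:
t = [2002.5, 2003.5, 2004.5, 2005.5]
x = [0.0, 24.0, 51.0, 75.0]
v = velocity_fit(t, x)   # about 25 mm/yr
```

With only a few yearly epochs, the slope is poorly constrained, which is why the abstract's velocities remain uncertain after three years of campaigns.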

After processing the data from the two-week campaign, I processed and reprocessed about 70 files all together. Of the 221 data files collected by the CSUSB-Harvey Mudd team since 2002, 82% have now been processed successfully. After three years of data collection the velocities are still somewhat uncertain. In particular, station 6106 has a velocity of about 11 cm/yr relative to North America; this is faster than the Pacific plate velocity of about 5 cm/yr and is thus unreasonable. The velocities of the remaining eleven stations seem more reasonable. Most of them are moving in a northwestern direction at 1.1 to 4.8 cm/yr, with the velocities generally increasing going southwestward onto the Pacific plate. This is what would be expected for elastic strain accumulation along the San Andreas and San Jacinto faults. One-dimensional elastic modeling of our data suggests that the deformation is concentrated on and eastward of the San Andreas fault, with relatively little deformation on the San Jacinto fault. However, because of the uncertainty of the velocities, the results of the modeling are highly preliminary.
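One-dimensional elastic modeling of interseismic GPS profiles is commonly done with the screw-dislocation form v(x) = (V/pi) arctan(x/D) of Savage and Burford (1973); whether that is the exact model used here is an assumption, and the parameter values below are illustrative, not the study's.

```python
# Interseismic fault-parallel velocity for a fault slipping at rate V
# below locking depth D (screw-dislocation model).
import math

def fault_parallel_velocity(x_km, slip_rate, locking_depth_km):
    """Velocity at distance x from the fault trace (same units as slip_rate)."""
    return (slip_rate / math.pi) * math.atan(x_km / locking_depth_km)

# 25 mm/yr deep slip rate, 15 km locking depth:
v_on = fault_parallel_velocity(0.0, 25.0, 15.0)      # zero at the trace
v_far = fault_parallel_velocity(300.0, 25.0, 15.0)   # approaches +V/2 far away
```

Fitting such curves to a velocity profile across both faults is how the relative contributions of the San Andreas and San Jacinto can be apportioned.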

WaDS: A New Approach to Operate on Massive Wavefield Datasets

Lopez, Julio (CMU), Leonardo Ramirez-Guzmán (CMU), Tiankai Tu (CMU), Jacobo Bielak (CMU), and David O'Hallaron (CMU)

Ground motion datasets are becoming increasingly large due to improvements in simulation techniques and advances in computer systems. Operating on these datasets becomes extremely challenging as they do not fit in memory all at once. We are currently building a Wavefield Database System, or WaDS, a new approach to operating on massive wavefield datasets such as those found in ground motion simulation. For example, simulations using CMU's hercules solver produce multi-terabyte 4D wavefield datasets, which need to be post-processed in order to extract useful information and produce derived products. Common operations include transposition, transformation between the time and frequency domains, filtering, subsetting, subsampling, and searching for minima and maxima. Other post-processing tools, such as visualization programs, require alternate derived data representations, requiring transformations of the solver's output.

Storing and transforming these datasets becomes more difficult as their size increases, putting them at risk of becoming "write-once read-never". WaDS provides simple, yet powerful, high-level abstractions for operating on wavefield datasets that are independent of the low-level dataset representations. Careful computer systems techniques in the WaDS implementation provide good performance for various operations by making efficient use of system resources such as IO and memory bandwidth, and by avoiding slowdowns caused by large access latencies. Decoupling the wavefield abstraction from the data representation allows WaDS to support multiple representations and indexing mechanisms that are optimized for different purposes: storage size, access latency, access throughput, etc. There is no need to convert solver output in order to load it into the database: WaDS readily supports commonly used solver output formats such as the hercules/etree and KBO formats, and is extensible to other formats through plugins that operate on particular data representations. Dataset indexing mechanisms allow WaDS to improve performance in highly selective query operations. WaDS automatically updates metadata to keep track of a dataset's origin, properties, and applied operations. A key feature of WaDS is its support for wavefield compression, reducing storage and bandwidth requirements. Preliminary results show 3:1 compression ratios for CMU's Hercules octree-based wavefields, approximately 10:1 compression ratios for 4D regular-grid wavefields, and query performance on compressed data comparable to that of uncompressed datasets for various workloads.
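The abstract does not specify WaDS' compression scheme, but the kind of ratios quoted are achievable on smooth wavefields by quantization followed by entropy coding; a loose, stdlib-only illustration on a synthetic trace:

```python
# Quantize a float32 trace to 16-bit integers (bounded relative error),
# then deflate with zlib, and compare against the raw float32 size.
import math
import struct
import zlib

# Synthetic smooth "wavefield" trace (a decaying sinusoid).
samples = [math.sin(0.01 * i) * math.exp(-1e-4 * i) for i in range(50000)]
raw = struct.pack("%df" % len(samples), *samples)     # 4 bytes/sample

peak = max(abs(s) for s in samples) or 1.0
quantized = [int(round(s / peak * 32767)) for s in samples]
packed = struct.pack("%dh" % len(quantized), *quantized)  # 2 bytes/sample
compressed = zlib.compress(packed, 9)

ratio = len(raw) / len(compressed)   # > 2:1 on this smooth trace
```

Real wavefield compressors exploit spatial and temporal coherence far more aggressively than this byte-level deflate, which is how regular-grid wavefields can reach the ~10:1 ratios reported.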

Radiated Seismic Energy From the 1994 Northridge Earthquake Determined from a Spontaneous Rupture Model

Ma, Shuo and Ralph Archuleta (UC Santa Barbara)

The radiated seismic energy of the 1994 Northridge earthquake (M 6.7) is calculated by simulating a spontaneous rupture model of the earthquake in a layered velocity structure. The stress drop distribution is determined from the inverted slip distribution (Hartzell et al., 1996) using a static finite element method. From the stress drop distribution we derive a spatially heterogeneous initial stress and yield stress. In concert with a slip-weakening friction law, we dynamically rupture the fault using a 3D finite element method, with a constant critical slip-weakening distance of 0.25 m everywhere on the fault. Using trial and error, we modified both the initial stress field and the yield stress until the dynamic rupture generated a rupture history and final slip distribution that approximately matched those determined by kinematic inversion. The resulting dynamic model provides a reasonably good fit between synthetics and near-field strong motion data. The total radiated seismic energy calculated from our dynamic model is 6.0 x 10^14 J and the apparent stress is 1.5 MPa. This estimate is close to the Gutenberg-Richter estimate of 7.1 x 10^14 J and the 6.5 x 10^14 J estimate of Mayeda and Walter (1996), roughly two times larger than the NEIC teleseismic estimate (3.1 x 10^14 J), but less than the estimates of 1.3 x 10^15 J (Kanamori and Heaton, 2000) and 1.2 x 10^15 J (McGarr and Fletcher, 2002). By examining the energy flux distribution on a 30-km-radius hemisphere enclosing the source, strong directivity effects are evident: three cones of energy are concentrated in the forward direction of rupture propagation. Our dynamic model also yields the fracture energy for the Northridge earthquake, 3.2 x 10^14 J, which is comparable to the radiated energy and cannot be ignored in the balance of all the energies.
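The quoted apparent stress can be checked against the radiated energy via sigma_a = mu E_R / M0. Assuming a standard crustal rigidity of 3e10 Pa (not stated in the abstract) and converting Mw 6.7 to moment with the Hanks-Kanamori relation:

```python
# Apparent stress from radiated energy and seismic moment.
mu = 3.0e10                       # shear modulus, Pa (assumed)
E_R = 6.0e14                      # radiated energy, J (from the abstract)
Mw = 6.7
M0 = 10 ** (1.5 * Mw + 9.05)      # seismic moment, N*m (Hanks & Kanamori, 1979)

sigma_a = mu * E_R / M0           # ~1.4e6 Pa, i.e. ~1.4 MPa
```

The result, about 1.4 MPa, is consistent with the 1.5 MPa apparent stress quoted in the abstract given the uncertainty in mu and M0.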

Topographic Effects of Ground Motions – Simulation of the 1994 Northridge Earthquake

Ma, Shuo, Ralph Archuleta, and Pengcheng Liu (UC Santa Barbara)

Topographic effects on ground motions are studied through explicit finite element simulations of the 1994 Northridge earthquake. Free-surface topography has long been known to have a large effect on recorded ground motions. However, these effects have not been well studied numerically because of the difficulties that the complicated free-surface boundary conditions pose for some numerical methods, e.g., the finite-difference method. To incorporate topography in ground motion calculations we have developed a finite element method that uses hexahedral elements and an optimal hourglass control scheme. To examine the effects of topography we compute ground motions with and without topography, incorporating the Hartzell et al. (1996) kinematic source model of Northridge, the SCEC 3D velocity structure of the San Fernando area, and the true topography. The minimum shear wave velocity in our calculations is 600 m/s; elastic waves are calculated accurately up to 0.5 Hz. We find that surface topography has a large effect on ground motions, and that horizontal and vertical ground motions show different characteristics. Horizontal ground motions tend to be amplified everywhere in elevated regions, with peaks having the largest amplification, whereas vertical motions tend to be de-amplified by topographic peaks but amplified by mountain flanks. Parts of the sedimentary basins and valleys are shown to be neither amplified nor de-amplified by the topography. In the frequency band of our simulation (0-0.5 Hz) the body waves are barely affected by topography; most of the effect is in the surface waves.

SCEC Community Modeling Environment (SCEC/CME) – Seismic Hazard Analysis Applications and Infrastructure

Maechling, Philip (USC), T.H. Jordan (USC), Carl Kesselman (USC/ISI), Reagan Moore (SDSC), Bernard Minster (UCSD), and

the SCEC ITR Collaboration

The SCEC/CME project is a Geosciences/IT partnership that is actively developing an advanced information infrastructure for system-level earthquake science in Southern California. This partnership includes SCEC, USC's Information Sciences Institute (ISI), the San Diego Supercomputer Center (SDSC), the Incorporated Research Institutions for Seismology (IRIS), and the U.S. Geological Survey. The goal of the SCEC/CME is to develop seismological applications and information technology (IT) infrastructure to support the development of Seismic Hazard Analysis (SHA) programs and other geophysical simulations.

The SHA application programs developed by project collaborators include a Probabilistic Seismic Hazard Analysis system called OpenSHA. OpenSHA computational elements that are currently available include a collection of attenuation relationships and several Earthquake Rupture Forecasts (ERFs). Geophysicists in the collaboration have also developed Anelastic Wave Models (AWMs) using both finite-difference and finite-element approaches, and earthquake simulations using these codes have been run for a variety of earthquake sources. Rupture Dynamic Model (RDM) codes have also been developed that simulate friction-based fault slip.

The SCEC/CME collaboration has also developed IT software and hardware infrastructure to support the development, execution, and analysis of these SHA programs. To support computationally expensive simulations, we have constructed a grid-based scientific workflow system. Using the SCEC grid, project collaborators can submit computations from the SCEC/CME servers to High Performance Computers at USC and TeraGrid High Performance Computing Centers.

Data generated and archived by the SCEC/CME is stored in a digital library system, the Storage Resource Broker (SRB), which provides a robust and secure system for maintaining the association between the data sets and their metadata.

To provide an easy-to-use system for constructing SHA computations, a browser-based computational pathway assembly web site has been developed. Users can compose SHA calculations, specifying SCEC/CME data sets as inputs and calling SCEC/CME computational programs to process the data and the output. By assembling a series of computational steps, users can develop complex computational pathways, the validity of which can be assured with an ontology-based pathway assembly tool.

Data visualization software developed by the collaboration supports analysis and validation of data sets. Several programs have been developed to visualize SCEC/CME data including GMT-based map making software for PSHA codes, 4D wavefield propagation visualization software based on OpenGL, and 3D Geowall-based visualization of earthquakes, faults, and seismic wave propagation.

The SCEC/CME Project also helps to sponsor the SCEC UseIT Intern program. The UseIT Intern Program provides research opportunities in both Geosciences and Information Technology to undergraduate students in a variety of fields. The UseIT group has developed a very capable data visualization tool, called SCEC-VDO, as a part of this undergraduate research program.

SCEC/CME CyberShake: Calculating Probabilistic Seismic Hazard Curves Using 3D Seismic Waveform Modeling

Maechling, Philip (USC), Scott Callaghan (USC), Yifeng Cui (SDSC), Mario Faerman (SDSC), Edward Field (USGS), Robert Graves (URS),

Nitin Gupta (USC), Vipin Gupta (USC), Thomas H. Jordan (USC), Carl Kesselman (USC), John Mehringer (USC), Gaurang Mehta (USC),

David Okaya (USC), Karan Vahi (USC), and Li Zhao (USC)

Researchers on the SCEC Community Modeling Environment (SCEC/CME) Project are calculating Probabilistic Seismic Hazard Curves for several sites in the Los Angeles area. The hazard curves calculated in this study use Intensity Measure Relationships (IMRs) based on 3D ground motion simulations rather than on attenuation relationships.

State-of-the-art Probabilistic Seismic Hazard Analysis (PSHA) is currently conducted using IMRs that are empirically based attenuation relationships. These attenuation relationships are relatively simple analytical models based on the regression of observed data. However, it is widely believed that significant improvements in SHA will rely on the use of more physics-based waveform modeling (e.g., Field et al., 2000, the SCEC Phase III overview paper). In fact, a more physics-based approach to PSHA was endorsed in a recent assessment of earthquake science by the National Research Council (2003).

In order to introduce the use of 3D seismic waveform modeling into PSHA hazard curve calculations, the SCEC/CME CyberShake group is integrating state-of-the-art PSHA software tools (OpenSHA), SCEC-developed geophysical models (SCEC CVM3.0), validated anelastic wave modeling (AWM) software, and state-of-the-art computational technologies including high performance computing and grid-based scientific workflows.

To calculate a probabilistic hazard curve for a site of interest (e.g., USC), we use the OpenSHA implementation of the National Seismic Hazard Mapping Project's 2002 earthquake rupture forecast for California (NSHMP-2002-ERF) and identify all ruptures within 200 km of the site. For each of these ruptures, we convert the NSHMP-2002-ERF rupture definition into one or more Ruptures with Slip Time History (Rupture Variations) using a Rupture Generator (Graves et al., this meeting). Strain Green Tensors are calculated for the site using the SCEC CVM3.0 3D velocity model. Then, using a reciprocity-based approach, we calculate synthetic seismograms for each Rupture Variation. The resulting suite of synthetics is processed to extract the peak intensity measures of interest (such as spectral acceleration). The peak intensity measures are combined with the original rupture probabilities to produce probabilistic seismic hazard curves for the site.
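The final combination step can be sketched as follows, treating a rupture's variations as equally likely; the rates and intensity measures are toy numbers, not NSHMP-2002-ERF values:

```python
# Annual exceedance rate: lambda(IM > x) =
#   sum over ruptures of rate_k * P(IM > x | rupture k),
# where P is estimated from the simulated peak IMs of the variations.

def hazard_curve(ruptures, im_levels):
    """ruptures: list of (annual_rate, [peak IM per rupture variation]).
    Returns the annual exceedance rate at each IM level."""
    curve = []
    for x in im_levels:
        lam = 0.0
        for rate, peaks in ruptures:
            p_exceed = sum(1 for p in peaks if p > x) / len(peaks)
            lam += rate * p_exceed
        curve.append(lam)
    return curve

rups = [
    (0.01, [0.35, 0.42, 0.55]),    # frequent rupture, moderate shaking
    (0.001, [0.80, 1.10, 0.95]),   # rare rupture, strong shaking
]
levels = [0.3, 0.5, 0.9]           # spectral acceleration levels, g
lam = hazard_curve(rups, levels)   # monotonically decreasing with level
```

Exceedance rates can then be converted to exceedance probabilities over a chosen time window, which is the form in which hazard curves are usually reported.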

The CyberShake calculations are performed on high performance computing systems including multiple TeraGrid sites (currently SDSC and NCSA), and at USC’s High Performance Computing and Communications (HPCC) center. The CyberShake job submission and data management uses a grid-based scientific workflow system developed on the SCEC/CME Project. The computation time and data storage required for the CyberShake calculations are made possible through a TeraGrid allocation of more than 145,000 SU’s and 70TB of disk storage.

An important goal of the CyberShake effort is to develop an OpenSHA-compatible 3D waveform-based IMR component that is fully integrated with the other components in OpenSHA. This will allow researchers to combine a new class of waveform-based IMRs with the large number of existing PSHA components, such as ERFs, that are currently implemented in the OpenSHA system.

Version 4 of the CVM

Magistrale, Harold (SDSU)

I present Version 4 of the CVM featuring the following enhancements: a new San Bernardino Valley representation, a new Salton Trough model, and a new Vp-density relation.

San Bernardino Valley: A new San Bernardino Valley basement is based on recent USGS inversion of gravity data confirmed by comparison to a seismic reflection line. The new model features a deep trough in the central valley, in contrast to the previous flat-bottomed valley model. The new basement joins smoothly to the relatively shallow Chino basin to the west.

Salton Trough: A new model is motivated by the needs of TeraShake simulations of southern San Andreas fault events. Depth to basement is defined by a combination of seismic refraction surveys, inversion of gravity observations, surface geology, and boreholes. Sediment velocity-depth gradients depend on the nature of the basement, merging smoothly into deep metasedimentary basement and having a discontinuity above shallow crystalline basement. The model includes the portion of the Trough south of the international border.

Vp-density: The new Vp-density relation is based on density measurements from oil well samples in the Los Angeles basin and the San Gabriel Valley, from geotechnical boreholes throughout southern California, and from 12 oil wells along the LARSE lines. The latter data include some coincident density and Vp measurements, while the other data sources contain density information without coincident Vp. To relate the density data lacking coincident Vp to Vp, I calculated the CVM Vp at the density measurement sites and compared the resulting measured-density/synthetic-Vp pairs to the measured density and Vp data. The calculated Vp values were consistent with the measured Vp (as could be inferred from the general fit of numerous synthetic waveforms calculated in the CVM to observations), so I could use the measured-density/synthetic-Vp pairs in determining the new relation. I experimented with breaking out the Vp-density data by stratigraphic affiliation and age, but found little statistical advantage in such breakouts, and so lumped all the data together.
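Testing whether a single constant ratio fits such paired data reduces to a one-parameter least-squares problem; an illustrative sketch with synthetic pairs (not the borehole data, whose actual ratio is not given in the abstract):

```python
# Best-fit constant ratio k for rho = k * Vp, minimizing
# sum_i (rho_i - k * vp_i)^2, which gives k = sum(rho*vp) / sum(vp^2).

def best_ratio(vp, rho):
    """One-parameter least-squares fit of rho = k * vp."""
    return sum(r * v for r, v in zip(rho, vp)) / sum(v * v for v in vp)

# Synthetic pairs constructed with an exact ratio of 0.45:
vp = [1500.0, 3000.0, 4500.0, 6000.0]    # m/s
rho = [675.0, 1350.0, 2025.0, 2700.0]    # synthetic, = 0.45 * Vp
k = best_ratio(vp, rho)
```

Comparing the residuals of this one-parameter fit against those of a two-parameter line (intercept plus slope) is one way to judge whether the constant-ratio form is statistically adequate.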

Surprisingly, the newly determined Vp-density ratio is constant, in contrast to the old relation. This is true even for low Vp, as defined by the geotechnical data. The new densities are higher, for a given Vp, than the old. This will tend to lower the Poisson ratio, which will lower Vp/Vs; that is, changing the Vp-density relation produces a new Vs model.
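A constant Vp-density ratio of the kind described can be estimated with a one-parameter least-squares fit. The sketch below is a generic illustration on synthetic numbers, not the actual CVM regression; `fit_constant_ratio` and the density-Vp pairs are hypothetical.

```python
import numpy as np

def fit_constant_ratio(vp, density):
    """Least-squares estimate of a constant ratio r in vp ~= r * density.

    Minimizing sum_i (vp_i - r * rho_i)^2 over r gives
    r = sum(vp * rho) / sum(rho**2).
    """
    vp = np.asarray(vp, dtype=float)
    rho = np.asarray(density, dtype=float)
    return float(np.sum(vp * rho) / np.sum(rho * rho))

# Synthetic density-Vp pairs (NOT the CVM data), constructed with ratio 1.5
rho = np.array([2.0, 2.2, 2.5, 2.7])   # g/cm^3
vp = np.array([3.0, 3.3, 3.75, 4.05])  # km/s
r = fit_constant_ratio(vp, rho)        # recovers the ratio 1.5
```

The same pooled fit could be repeated on stratigraphic or age subsets to check, as the abstract does, whether separate breakouts buy any statistical advantage.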

Broadband Ground Motion Simulation: Combining Finite-Difference Synthetics With Local Scattering Operators

Mai, Martin (ETH, Zurich, Switzerland) and Kim B. Olsen (SDSU)

We explore a new approach for generating broadband synthetic seismograms for large (extended-fault) earthquakes. The method combines low-frequency (LF) finite-difference synthetics with high-frequency (HF) scattering operators. The method uses spatially variable rupture parameters, such as slip, rupture velocity, and shape of the source-time functions, for the computation of near-source ground motions in possibly complex three-dimensional Earth models. In the first step we calculate the LF finite-difference seismograms for the chosen earthquake source model at the observer locations of interest. We then compute a site-specific scattering Green’s function for a random, isotropic scattering medium (Zeng et al., 1991), with scattering parameters (scattering coefficient, absorption coefficient) adapted from published estimates. The scattering Green’s functions are convolved with the moment-rate function to form local scattering operators (‘scatterograms’). LF finite-difference synthetics and HF scatterograms are then combined in the frequency domain using the methodology of Mai and Beroza (2003), which enables a match in both amplitude and phase at the intersection frequency of the LF and HF synthetics.
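The LF/HF combination step can be sketched as a frequency-domain merge with a tapered crossover at the matching frequency. This is a schematic stand-in: the actual Mai and Beroza (2003) procedure matches both amplitude and phase at the intersection frequency, which the simple cosine taper below does not fully reproduce.

```python
import numpy as np

def combine_lf_hf(lf, hf, dt, f_match, width=0.5):
    """Merge LF and HF synthetics in the frequency domain.

    Below f_match the LF spectrum dominates; above it the HF spectrum does,
    with a cosine-tapered crossover of half-width `width` (Hz). Schematic
    only; it omits the phase-matching details of Mai and Beroza (2003).
    """
    n = len(lf)
    freqs = np.fft.rfftfreq(n, d=dt)
    LF, HF = np.fft.rfft(lf), np.fft.rfft(hf)
    # LF weight: 1 well below f_match, 0 well above, cosine taper between
    w = np.ones_like(freqs)
    lo, hi = f_match - width, f_match + width
    band = (freqs >= lo) & (freqs <= hi)
    w[freqs > hi] = 0.0
    w[band] = 0.5 * (1.0 + np.cos(np.pi * (freqs[band] - lo) / (hi - lo)))
    combined = w * LF + (1.0 - w) * HF
    return np.fft.irfft(combined, n=n)

# Illustrative signals: a 0.2 Hz "LF" wave and a 5 Hz "HF" wave, dt = 0.01 s
t = np.arange(1000) * 0.01
lf = np.sin(2 * np.pi * 0.2 * t)
hf = 0.1 * np.sin(2 * np.pi * 5.0 * t)
bb = combine_lf_hf(lf, hf, dt=0.01, f_match=1.0)
```

Below the crossover the broadband spectrum equals the LF spectrum, above it the HF spectrum, so the merged trace keeps the deterministic long periods and the scattered short periods.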

We apply our novel methodology to near-source ground-motion simulations for the 1994 Northridge earthquake, focusing on 30 selected sites (both rock and soil sites) for which high-quality recordings exist. We evaluate the need for applying site-dependent amplification factors accounting for the effects of shallow layers of low shear-wave velocity (Borcherdt, 1994, 2002). The model calculations are compared against the recordings in terms of residuals and model bias of spectral acceleration for the ensemble of ground motions, rather than trying to match a single recording.

Holocene Slip Rate of the San Andreas Fault at Plunge Creek, Highland, California

McGill, Sally (CSU San Bernardino), Ray Weldon (Univ. Oregon), Lewis Owen (Univ. Cincinnati), and Amanda Lopez (CSU San Bernardino)

In an effort to measure the Holocene slip rate of the San Andreas fault, we have continued field mapping and have conducted trenching at an offset channel in the East Highlands Ranch area of Highland. An abandoned channel of Plunge Creek is preserved on the southwestern (downstream) side of the fault. It is a remnant from a time when the channel flowed parallel to the fault in order to connect two channel segments that had been offset by slip on the San Andreas fault. The fault-parallel channel segment was abandoned when the channel re-incised straight across the fault. Since that time, the newly incised channel wall has also been offset by the San Andreas fault. On the northeast (upstream) side of the fault there are two major gravel-fill terraces preserved within the canyon of Plunge Creek. The channel wall that truncates the fault-parallel segment of the abandoned channel most likely correlates with the riser between the high and low terraces northeast of the fault. A preliminary estimate of the right-lateral offset of this feature is about 307 meters.

Several trenches on the high and low terraces were excavated in November and December of 2004 for the purpose of collecting dateable material to constrain the age of the offset terrace riser. Fourteen charcoal samples have been submitted for radiocarbon dating, and fifteen samples were collected for optically stimulated luminescence dating. Samples for cosmogenic nuclide surface exposure dating were also collected from boulders on the high and low terraces. The pending dates and a more refined measurement of the offset will allow us to estimate the Holocene slip rate along this part of the San Andreas fault.

The San Andreas and San Jacinto faults are the dominant faults within the plate boundary fault system in southern California. Together they accommodate about 70% of the Pacific-North America plate motion. However, there has been considerable debate as to whether the San Andreas fault contributes substantially more to the plate boundary deformation in southern California than the San Jacinto fault, whether the two faults contribute approximately equally, or even whether the San Jacinto fault contributes more. The Holocene slip rate of the San Andreas fault in Cajon Pass is well documented at 24.5 +/- 3.5 mm/yr (Weldon and Sieh, 1985). However, some investigators suggest that the San Andreas fault slip rate decreases southeastward along the San Bernardino strand, as more and more slip is accommodated by the San Jacinto fault (Matti and others, 1992; Morton and Matti, 1993). Our work on measuring the Holocene slip rate of the San Andreas fault in Highland is aimed at testing this hypothesis.


Earthquake Triggering in the Gulf of California/Salton Trough Region

McGuire, Jeff (WHOI)

The transition from a transform to a spreading plate boundary in southernmost California, northern Mexico, and the Gulf of California also corresponds to a distinct transition in earthquake clustering behavior. In this area, the earth’s crust changes from about 25 km thick continental material to 6 km thick oceanic crust, and the heat flow is extremely high throughout. The seismicity shows two distinct features that are characteristic of oceanic regions but here occur within the continental crust. First, the aftershock sequences following large earthquakes are relatively subdued. Second, there are often swarms of earthquakes, on both normal and strike-slip faults, that last from a few hours to a few days. Both types of clustering behavior can be quantified as deviations from seismic scaling laws. Moreover, these deviations can be utilized as tests of rheological models which are based on the interplay between brittle and ductile rheological laws. Unfortunately, the quality of earthquake catalogs in this region, particularly offshore, is relatively poor. Many magnitude 5.5 and smaller earthquakes in the Gulf of California routinely go uncataloged. Recently, seismic waveform data have become available for this region that are sufficient for developing earthquake catalogs complete down to about magnitude 3. We are beginning to utilize these waveform data to construct the catalogs necessary for documenting the spatial transition from continental to oceanic earthquake clustering behavior. We hope to determine how sharp this transition is and whether it relates primarily to geological or thermal structure. Initial results demonstrate that the transition in aftershock depletion, as quantified by Bath’s law, is not spatially localized or correlated with crustal thickness.
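One simple way to quantify "subdued" aftershock sequences is the deviation from Bath's law, which empirically predicts the largest aftershock to be about 1.2 magnitude units below the mainshock. The metric below is a minimal sketch of that idea, not necessarily the exact statistic used in this study.

```python
def bath_deviation(mainshock_mag, largest_aftershock_mag, expected_gap=1.2):
    """Deviation from Bath's law: the observed magnitude gap between a
    mainshock and its largest aftershock, minus the ~1.2-unit gap the
    empirical law predicts. Positive values indicate a depleted (subdued)
    aftershock sequence relative to the law."""
    observed_gap = mainshock_mag - largest_aftershock_mag
    return observed_gap - expected_gap

# Hypothetical example: an M 6.0 mainshock whose largest aftershock is
# only M 4.0 has a gap of 2.0, i.e. 0.8 units more depleted than expected.
d = bath_deviation(6.0, 4.0)
```

Mapping such deviations for many sequences is one way to test whether aftershock depletion localizes spatially or tracks crustal thickness.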

Spatial Localization of Moment Deficits in Southern California

Meade, Brendan (Harvard) and Bradford Hager (MIT)

The balance between interseismic elastic strain accumulation and coseismic release defines the extent to which a fault system exhibits a surplus or deficit of large earthquakes. We calculate the regional moment accumulation rate in southern California based on a fault slip rate catalog estimated from a block model of interseismic deformation constrained by GPS measurements. The scalar moment accumulation rate, (17.8 +/- 1.1) x 10^18 N m/yr, is approximately 50% larger than the average moment release rate over the last 200 years. Differences between the accumulated and released elastic displacement fields are consistent with moment deficits that are localized in three regions: the southern San Andreas and San Jacinto faults, offshore faults and the Los Angeles and Ventura basins, and the Eastern California Shear Zone. The moment budget could be balanced by coseismic events with a composite magnitude of Mw = 8.
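As a back-of-envelope consistency check, a composite magnitude of this size follows from the standard Hanks-Kanamori moment-magnitude relation. The "unreleased third" in the example below is a rough reading of the 50%-larger accumulation rate, not a number taken from the study.

```python
import math

def moment_to_mw(m0_newton_meters):
    """Hanks-Kanamori (1979) moment magnitude: Mw = (2/3)(log10 M0 - 9.05),
    with the scalar seismic moment M0 in N m."""
    return (2.0 / 3.0) * (math.log10(m0_newton_meters) - 9.05)

# If accumulation (17.8e18 N m/yr) exceeds release by ~50%, roughly one
# third of the accumulated moment goes unreleased; over 200 years that is
# about 1.2e21 N m, i.e. close to a single Mw 8 event.
deficit = (17.8e18 / 3.0) * 200.0
mw = moment_to_mw(deficit)
```

This kind of arithmetic only balances the scalar budget; the spatial localization of the deficit requires the full displacement-field comparison described in the abstract.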

Testing the SCEC Community Velocity Model with Ray Tracing Software

Mellors, Robert (SDSU), Kun Marhadi (SDSU), and Victor Pereyra (Weidlinger Associates)

We test the use of the seismic ray tracing program Integra to calculate travel times and amplitudes for selected earthquakes in Southern California. Developed primarily for the petroleum industry, Integra can perform 3D ray tracing through isotropic heterogeneous blocky media and homogeneous transversely isotropic media. As an initial test, we have developed a preliminary version of a location algorithm that performs a multistep grid search based on travel times calculated using a simplified version of the SCEC community velocity model. Tests are performed for earthquakes in the Los Angeles basin (notably the 1994 Northridge event and aftershocks) and on the San Jacinto fault near Anza. We also compare observed and theoretical amplitudes of body phases.


New Map of Quaternary Faults, Northern Mojave Desert: Do Previously Unrecognized Holocene Faults Partition Strain North toward the Basin and Range?

Miller, D.M., S.L. Dudash, L. Amoroso, G.A. Phelps, K.M. Schmidt, D. Lidke, J.L. Redwine, D.R. Bedford, and C. Menges (USGS)

Dozens of previously unmapped east- and northwest-striking faults displaying Quaternary offsets have been identified by recent US Geological Survey 1:100,000-scale surficial geologic mapping. Systematic mapping covered the northern Mojave Desert between longitude 118 and 115 degrees. Collectively, these structures exhibit a pattern that is more complicated than previous models of discrete domains of like-trending structures. The newly mapped faults in some places differ in location and pattern from previously mapped faults, much as the recent Landers and Hector Mine fault ruptures diverged from previously mapped faults. Many faults strike north to northwest and express dextral offset, and another group of east-striking sinistral faults lies mostly in the NE area. However, local domains of northwest-striking dextral and east-striking sinistral faults are interleaved in complex patterns in the boundary between the regional domains, and outcrop relations indicate the fault sets act together and are contemporaneous.

Patterns of Holocene fault activity offer clues to the current strain partitioning. By study of outcrop alone, we cannot confidently assign Holocene ages to faults. However, conservative assessment, using only those faults in the northern Mojave Desert that display scarps in Holocene deposits, indicates three systems of active faults: (1) Lockhart, Mt. General, and Gravel Hills faults with NW strikes lie in the west, (2) a previously unmapped N-striking fault system (the Paradise fault zone) passes from Coyote Lake to Goldstone Lake, and (3) the E-striking Manix fault system (a complex of several faults) and newly identified Mesquite Spring fault lie in the east. Systems 1 and 2 are dextral and extend nearly to the Garlock fault, whereas system 3 is sinistral and extends east to the Soda Lake area, where Holocene strain may be focused on the Soda-Avawatz fault leading north to the Death Valley fault system. This pattern of Holocene faults has many tectonic implications, including: (1) the Blackwater fault may not be an important part of the Holocene strain budget, (2) expansive unfaulted regions separated by narrow corridors of active faulting provide constraints on middle-crust strain partitioning, and (3) east-, north-, and northwest-striking faults are all active in strike-slip mode and occur in mutually interfingering domains, suggesting that vertical-axis block rotation is not currently a mechanism for strain accommodation. Geologic and geodetic studies are needed to test our results.

Interparticle Force Distributions Within Shearing Granular Fault Gouge

Morgan, Julia K. and Adam Halpert (Rice University)

Micromechanical simulations of granular gouge conducted in 2-D allow us to examine the spatial and orientation distributions of interparticle contact forces and their evolution with time. Modeled after laboratory experiments of Anthony and Marone (2005), fault gouge consists of rounded grains with diameters between 53 and 105 µm, in both well-sorted and poorly sorted arrangements, sheared between smooth and rough planar walls. Applied normal stresses of 5 or 10 MPa are in the non-fracturing regime. Simulations are also carried out at several different shearing velocities to examine the velocity dependence of friction.

The normal and shear stresses acting on the shear zone walls are supported by complex networks of interparticle forces that span the gouge zones. Interparticle forces greater than average define inclined force chains that evolve as the gouge deforms, mapping the internal stress and deformation field. Boundary conditions strongly influence the stability of the force chains. Rough boundaries, constructed of rows of 500 µm diameter particles, couple with the gouge zone, causing distributed deformation and continuous reorganization of the force chains. Stress drops occur when major force chains fail and, as in laboratory experiments, their magnitudes correlate directly with the mean size of the particles, or inversely with their abundance. In contrast, shear zones with smoother boundaries, composed of rows of 10 µm particles, develop enduring boundary shears that preserve the internal force network. Stress drops accompany the onset and cessation of boundary shearing, and are not directly related to particle size. See poster for further details.

Simulation Discovery within Synthetic and Observed Seismogram Analysis (SOSA 1.1)

Muench, Joanna (IRIS), Philip Maechling (USC), and Steve Doubleday (Less Computing)

The Synthetic and Observed Seismogram Analysis (SOSA) application has expanded its access to synthetic seismograms through the Data Discovery and Distribution System (DDDS). DDDS is a tool that enables search and retrieval of earthquake simulation products within the Storage Resource Broker. SOSA uses DDDS to retrieve synthetic seismograms that can be processed within the application or saved to local disk. SOSA allows searches of earthquake simulations based on geographic location and magnitude. To retrieve synthetic seismograms associated with a simulation, a user can enter geographic coordinates or use the location of an existing seismic station. The integration of DDDS with SOSA provides users with an easy way to browse, analyze and save earthquake simulations created within the SCEC CME.

A Comprehensive 3-D Geophysical Model of the Crust in the Eastern Transverse Ranges

Needy, Sarah K. (SCEC/SURE Intern, IUPUI), Andrew P. Barth (IUPUI), J. Lawford Anderson (USC), and Kenneth L. Brown (IUPUI)

The Eastern Transverse Ranges provides an ideal sampling location for crustal information because of the exposure of a variety of crustal layers from varying depths. The easternmost areas and the adjacent south central Mojave Desert expose upper crustal rocks, and rocks further to the west were exhumed from progressively deeper levels, nearly to the present day average Moho. Eastern areas of the ranges contain mostly volcanic, sedimentary, and granitic rocks. Western areas contain mostly granite, granodiorite, tonalite, and gneiss with no volcanic or sedimentary rocks. The variety of rock types and range of crustal depths allows detailed estimation of geophysical characteristics for a representative in situ southern California crustal column.

This study is based on analyses of approximately 300 samples of basement rocks of the Eastern Transverse Ranges. P-wave velocity was calculated using Christensen and Mooney’s (1995) equation for velocity variation as a function of density and temperature (table below). P-wave velocity through these rock types varies with depth, pressure, and temperature. Velocities for average heat flow are listed only for regions where the rock occurs; a volcanic rock, like basalt, occurs only in the upper crust and will not have a middle crust velocity listed. Rock types occurring in both upper and middle crust, like granite, will have a velocity range listed for each depth region.


Rock Type       Upper Crust Vp (km/s)   Middle Crust Vp (km/s)
basalt          6.6 - 6.7               --
quartzite       5.8 - 6.2               --
granite         5.6 - 6.1               5.6 - 6.1
granodiorite    5.8 - 6.2               5.8 - 6.2
gneiss          --                      5.7 - 6.9
amphibolite     --                      6.8 - 7.6
tectonite       --                      5.8 - 6.5
tonalite        --                      6.2 - 6.5

Magnetic susceptibility was directly measured on hand specimens, and results follow two trends with increasing P-wave velocity. Generally, a positive correlation exists in the volcanic and granitic rocks, but no correlation is apparent for the metamorphic rocks.

These velocity and magnetic susceptibility values, based on surface samples, will be used to make a depth-dependent geophysical model using GIS. Depth assignments for surface samples were derived from pressure estimates based on the aluminum content of amphiboles in selected granitic rocks. The resulting model range of velocities and magnetic susceptibilities at varying depths in the crust can be compared to models constructed from aeromagnetic and seismic data.

SCEC/UseIT: Virtually Finding Your Way through California

Nerenberg, Lewis (UCSC)

UseIT is a summer internship for undergraduates in computer science, earth science, and other majors, funded by the NSF and hosted at the University of Southern California.

The 'Grand Challenge' for the 22 interns this year was to engineer an earthquake monitoring system by continuing the development of SCEC-VDO, a 3-D visualization program with capabilities for viewing earthquake catalogs, documented faults and other information. It also has a built-in ability to make movies containing all this information.

An important quality for any 3-D visualization program is the ability to know what you are viewing at all times. My work on the SCEC-VDO software has been primarily geared towards facilitating this sense of location. I spent a great portion of time transforming the Highways plugin from its former existence as a purely on/off option into an interactive GUI which can display whichever subset of highways you wish. This process proved more challenging than expected, since the previous code and source file were quite difficult to decipher, and I spent much time cleaning them up before new work could be added. This new functionality for the highways, as well as the introduction of a finer latitude-longitude grid, will significantly enhance the user's ability to relate earthquake data to human developments and communicate this knowledge through movie clips.

High-Resolution Stratigraphy and the Evolution of an Active Fault-Related Fold in 3D, Santa Barbara Channel, California

Nicholson, Craig, Christopher Sorlien, Sarah Hopkins, James Kennett (UC Santa Barbara), William Normark, Ray Sliter, and Michael Fisher (USGS)

As part of a global climate study investigating the sedimentary record of Santa Barbara Basin, grids of high-resolution MMS analog, industry multichannel, and 2D high-resolution seismic reflection data (collected in separate cruises by the USGS and UCSB in 2002) are being used to correlate dated
reference horizons from ODP Site 893 across the basin to the Mid-Channel Trend. The Mid-Channel Trend is an active oblique fold related to the offshore Oak Ridge fault system. Continuous late-Quaternary strata deposited in the deep paleo-bathymetric basin were uplifted and folded across this Mid-Channel anticline. Extrapolation of available age data suggests that strata as old as ~450 ka (OIS 12) appear to be exposed at the seafloor, where they are now accessible to piston coring. The mapped horizons (dated at ~110 ka and ~160 ka, and including sequence boundaries interpolated at ~320 ka, ~420 ka, and ~475 ka) provide the basis for modeling the evolution of the structure and stratigraphy in 3D and for precisely locating suitable core sites. In late August 2005, gravity and piston cores, together with deep-towed chirp data, will be taken using the R/V Melville to survey and sample these horizons and their intervening sequences where they are expected to crop out over the Mid-Channel Trend. Subsequent analyses of the cores and chirp data will be used to verify the predicted outcrop pattern and the basin-wide sequence stratigraphic interpretation. Thus, in addition to its contributions to Quaternary climate history, this project will help document the nature, geometry and evolution of the Mid-Channel anticline, its relation to the oblique Oak Ridge fault, and the local interaction between tectonics, climate, and sea-level change. To date, our results show that the Mid-Channel Trend has propagated from east to west as previously proposed. South of Santa Barbara harbor, folding on the anticline began about 1 Ma, while 10 km farther west, folding began after ~475 ka. Our sequence of multiple mapped reference horizons documents a fairly complicated process of how slip on the deep fault system is transformed at shallow levels into fold growth as different strands and back-thrusts become active.
The active offshore Oak Ridge fault is thus mostly blind, despite offsetting the unconformity created during the Last Glacial Maximum.

Stream Channel Offset and Late Holocene Slip Rate of the San Andreas Fault at the Van Matre Ranch Site, Carrizo Plain, California

Noriega, Gabriela (UC Irvine), J Ramon Arrowsmith (ASU), Lisa Grant (UC Irvine), and Jeri Young (ASU)

Well-preserved channels offset across the San Andreas fault (SAF) at the Van Matre Ranch (VMR) site in the northwestern Elkhorn Hills area of the Carrizo Plain offer the opportunity to measure slip rate and examine geomorphic development of the channels. The fault zone and offset channels were exposed by excavation in one fault-perpendicular and five fault-parallel trenches. The geomorphology and stratigraphy in the channels reveal a record of filling by fluvial sedimentation, lateral colluviation, and pedogenesis. The buried thalweg of the currently active channel is offset 24.8 +/- 1 m, while the geomorphic channel is offset approximately 27.6 +/- 1 m. Seventeen samples were collected from channel margin deposits for 14C dating. An OxCal model of the radiocarbon dates with stratigraphic control suggests that the oldest date for channel incision was ~1160 AD. Minimum and maximum slip rates ranging from 29.5 to 35.8 mm/yr result from different assumptions about the timing of channel incision and offset. The minimum slip rate results from using the maximum time interval, from the earliest channel incision date to the present. The maximum slip rate is derived from a shorter time interval culminating in the most recent, 1857 co-seismic offset. The range of slip rates at VMR agrees well with the late Holocene slip rate of 33.9 +/- 2.9 mm/yr at Wallace Creek, approximately 18 km to the northwest, and implies that, within measurement uncertainty, the 30-37 mm/yr velocity gradient across the SAF from decadal time-scale geodetic measurements is accommodated across the several-meter-wide SAF zone at VMR over the last millennium.
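The slip-rate bounds follow from dividing the offset by alternative time intervals. The sketch below uses round numbers from the abstract for illustration only; it will not exactly reproduce the quoted 29.5-35.8 mm/yr range, which rests on the full OxCal age model rather than a single incision date.

```python
def slip_rate_mm_per_yr(offset_m, years):
    """Average slip rate (mm/yr) from an offset (m) accrued over `years`."""
    return offset_m * 1000.0 / years

# Minimum rate: the 24.8 m thalweg offset spread over the longest interval,
# from channel incision at ~1160 AD to the (assumed) 2005 survey date.
rate_min = slip_rate_mm_per_yr(24.8, 2005 - 1160)

# Maximum rate: the same offset accrued by the most recent (1857) event.
rate_max = slip_rate_mm_per_yr(24.8, 1857 - 1160)
```

The bracketing interval choice, not the offset measurement, dominates the spread between the minimum and maximum rates.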


Dynamic Rupture in the Presence of Fault Discontinuities: An Application to Faults in the Marmara Sea, Turkey

Oglesby, David D. (UCR), P. Martin Mai (ETH), Kuvvet Attakan (ETH), Stefano Pucci (ETH), and Daniela Pantosti (ETH)

The faults in the Sea of Marmara, Turkey, contribute greatly to the seismic hazard in this region. Of particular interest for the mega-city Istanbul is the fault system that straddles the northern rim of the Marmara basin, consisting of several, potentially linked faults, which have produced several devastating earthquakes (~ M 7.5) in the past 250 yrs (1912, 1766a,b, 1894). Geological analysis has indicated that the strike-slip Central Marmara Fault, the normal North Boundary Fault, and the strike-slip Izmit Fault form a linked fault system, where the North Boundary Fault acts as a linking normal fault in a dilational stepover between the two strike-slip segments. At this point it is not known whether this fault system tends to rupture in multi-segment, very large events, or whether it tends to generate multiple events on the individual segments. The dynamics of each of these scenarios with their associated ground motions will strongly affect seismic hazard estimates for this region.

In the present work we use the 3D dynamic finite element method to investigate whether this fault system could rupture in a single event, leading to a potentially large earthquake. Assuming that all faults intersect at depth and that there is a simple regional stress field, we find that typical earthquakes rupture across the stepover, leading to amplified slip, especially on the linking normal fault. The results are consistent with more generic models of fault stepovers in the presence of linking faults, and could have important implications for the capacity of similar fault systems to produce large, multi-segment earthquakes.

SCEC-CME Pathway 2 Computation of Synthetic Wave Propagation: Access to Capability Plus Availability of Software Codes

Okaya, David (USC), Vipin Gupta (SCEC), and Phil Maechling (SCEC)

Generation of synthetic seismic wave propagation within a subvolume of southern California is a major component of the SCEC-ITR project. Calculation of synthetic seismograms or propagating wavefields through the 3D subsurface is used to create PGV intensity maps, to compare synthetics with observed seismograms at stations within southern California, to create Frechet kernels and corroborate Green's functions, and to iteratively improve the community velocity models of southern California. For these purposes, routine set-up and execution of wave propagation codes on fast computing facilities is desirable. Execution of a wave propagation code is made challenging, however, by the manual and time-intensive steps needed to prepare a simulation (including the creation of velocity meshes, source descriptions, and selection of stable code parameters). The framework of Pathway 2 is designed to replace the manual aspects of this setup and also to offer choices of which propagation codes and/or velocity models to use.

We announce the availability of a CME tool to create synthetic seismograms. A compositional work tool (CAT) is the user interface used to define and submit a request. The tool allows a choice of wave propagation code (Olsen 3D finite difference, Carnegie Mellon finite element, Graves 3D finite difference) and velocity model (SCEC CVM 2.2 or 3.0, Harvard VM, Hadley-Kanamori, constant velocity, layer over a halfspace). The source description currently implemented uses double-couple point sources. Either the waveforms or post-processed intensity-measure maps can be requested.

We also announce the availability of the scientific and utility applications used within this Pathway 2 tool. Download links are established within the CME web site http://www.scec.org/cme/downloads. We offer over 100 Fortran based codes and GMT scripts/files with the following functionalities: velocity model codes, creation of input meshes to the velocity model codes, create source files to
AWM codes, create input files to AWM codes, calculate intensity measures, reformat synthetic waveforms, digital signal processing of waveforms, extract slices and data subsets, graphical maps using GMT, support CPT and California faults for GMT, georeference latitude/longitude-to-UTM conversions. A text-based metadata and command line parser is included. These codes will run individually or within the CME Pathway 2.

Flexible Steel Buildings and Puente Hills Scenario Earthquakes

Olsen, Anna (Caltech) and Tom Heaton (Caltech)

As part of the “Rupture-to-Rafters” program, we use simulated earthquake time histories to estimate the seismic response of twenty-story, steel, moment-resisting frame buildings in the Los Angeles Basin. The ground motions were generated from the SCEC3D model of a Mw 6.8 earthquake on the Santa Fe and Coyote Hills Segments of the Puente Hills Fault. The peak simulated dynamic displacement was 1.8 m, and the peak simulated velocity was 3.5 m/sec. The twenty-story (78 m) building behavior is modeled with elastic-perfectly-plastic beam elements using the finite element method. Preliminary results predict an area of approximately 60 km2 south (up-dip) of the fault where a twenty-story building could experience an inter-story drift ratio exceeding 0.1. Also, several buildings experienced a permanent offset of 1 m at the roof. Future work will use a more advanced material model which includes weld fractures, consider other building heights, and simulate the response to other scenario earthquakes.
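The inter-story drift ratio used as the demand measure above is the relative lateral displacement of adjacent floors divided by the story height. A minimal sketch, with hypothetical displacement values (not results from the study):

```python
def max_interstory_drift_ratio(floor_disp_m, story_height_m):
    """Peak inter-story drift ratio: the largest difference in lateral
    displacement between adjacent floors, divided by the story height."""
    drifts = [abs(b - a) for a, b in zip(floor_disp_m, floor_disp_m[1:])]
    return max(drifts) / story_height_m

# Hypothetical displacements (m) at five adjacent floors of a twenty-story,
# 78 m frame (3.9 m stories); an IDR above ~0.1 signals collapse-level demand.
u = [0.00, 0.05, 0.50, 0.90, 1.20]
idr = max_interstory_drift_ratio(u, 3.9)
```

In the simulations described, this quantity is evaluated from the full nonlinear finite-element response rather than a handful of floor displacements.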

TeraShake: Large-Scale Simulations of Earthquakes on the Southern San Andreas Fault

Olsen, Kim B. (SDSU), Steven M. Day (SDSU), J. Bernard Minster (UCSD), Yifeng Cui (SDSC), Amit Chourasia (SDSC), Yuanfang Hu (SDSC), Yi Li (SDSC), Jing Zhu (SDSC), Marcio Faerman (SDSC), Reagan Moore (SDSC), Phil Maechling (USC), and Tom Jordan (USC)

We have carried out some of the largest and most detailed earthquake simulations completed to date (TeraShake), in which we model ground motions expected from a large earthquake on the southern San Andreas fault. The TeraShake calculations simulate 4 minutes of 0-0.5 Hz ground motion in a 180,000 km2 area of southern California, for a M 7.7 earthquake along the 199 km section of the San Andreas fault between Cajon Creek and Bombay Beach at the Salton Sea. The two segments of the San Andreas fault south of the 1857 rupture, the San Bernardino Mountains segment and the Coachella Valley segment, have not seen a major event since 1812 and about 1690, respectively. The average recurrence intervals for large earthquakes with surface rupture on these segments are 146 (+91/-60) yrs and 220 +/- 13 yrs, respectively. In other words, a major component of the seismic hazard in southern California and northern Mexico comes from a large earthquake on this part of the San Andreas Fault.

The simulations include ruptures propagating both toward the northwest and toward the southeast on the fault, and the kinematic source model is based on that inferred for the 2002 Denali Earthquake. The crustal model is taken from the SCEC 3D Community Velocity Model Version 3, discretized into cubes 200 m on a side. The results show that the chain of sedimentary basins between San Bernardino and downtown Los Angeles forms an effective waveguide that channels Love waves along the southern edge of the San Bernardino and San Gabriel Mountains. Earthquake scenarios in which the guided wave is efficiently excited (scenarios with northward rupture) produce unusually high long-period ground motions over much of the greater Los Angeles region. Intense, localized amplitude modulations arising from variations in waveguide cross-section can be explained in terms of energy conservation in the guided mode. Less certain than the spatial pattern are the predicted absolute amplitudes of the ground

166

Page 164: 2004 SCEC Annual Meeting · Web viewThis large, comprehensive document (188 pp., 8.1 MB) provides an excellent snapshot of the Center’s 2004 activities. It comprises following sections:

motion extremes, as nonlinearity induced by the higher-than-anticipated waveguide amplifications we have identified here would likely cause significant reduction of both shear modulus and Q factor in the near-surface layers.

The high resolution of the TeraShake simulations reported in this study was made possible by the availability of substantial computational resources. The simulations required up to 19,000 CPU hours on 240 processors of the 10 teraflops IBM Power4+ DataStar supercomputer at the San Diego Supercomputer Center (SDSC). The fourth-order finite-difference code uses high-performance parallel I/O, with multiple tasks writing to the same file simultaneously, and achieves a significant speedup factor of 225 when executed on 240 processors. A large database of synthetic seismograms from TeraShake is now available online at the Storage Resource Broker (SRB) at SDSC (http://www.scec.org/cme/TeraShake) for earthquake engineers, seismologists, and city planners to use, along with a wealth of animations of the simulated wave propagation (http://visservices.sdsc.edu/projects/scec/terashake). In the future, the modeling team plans to integrate physics-based spontaneous fault rupture models to initiate the simulations, which may provide more realistic rupture propagation compared to the kinematic source approximation used here.

SCEC/UseIT: Georeferencing & Draping Images Over DEMs

Ooi, Ee Ling (USC)

The Undergraduate Studies in Earthquake Information Technology (UseIT) is a program under the Southern California Earthquake Center (SCEC) that integrates computer technology with earthquake science. Its newest product, SCEC-VDO, is a piece of earthquake visualization software created by the intern group. Part of the summer’s grand challenge was to create an earthquake monitoring system by improving the existing functions and at the same time adding many new functions to the application.

I participated in improving the surface plugin and was responsible for designing a user-friendly interface for it. Although the plugin worked before, adding a correctly georeferenced image was not an easy process. I redesigned the Graphical User Interface (GUI) and added a new function that lets the application save previously loaded images, Digital Elevation Models (DEMs), and images draped on DEMs. Thus, the user only has to go through the georeferencing process once and can reuse the image multiple times.

Now, the process is easier and more intuitive to the end-user. This will ultimately speed up the visualization and movie making processes and help produce effective real-time movies.
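The georeferencing bookkeeping such a plugin must persist can be illustrated with a simple north-up affine mapping from longitude/latitude to image pixels. The corner coordinates and pixel spacings below are hypothetical, and SCEC-VDO's actual internal representation may differ:

```python
def make_georef(west, north, px_width_deg, px_height_deg):
    """Affine georeference for a north-up image.

    Maps (lon, lat) to (column, row) pixel coordinates given the
    upper-left corner and per-pixel spacing in degrees. A minimal
    sketch of what saving a georeference amounts to, so an image
    only has to be georeferenced once.
    """
    def lonlat_to_pixel(lon, lat):
        col = (lon - west) / px_width_deg
        row = (north - lat) / px_height_deg  # rows increase southward
        return col, row
    return lonlat_to_pixel

# Hypothetical image covering 120W-116W, 33N-35N at 0.01 deg/pixel.
to_px = make_georef(west=-120.0, north=35.0,
                    px_width_deg=0.01, px_height_deg=0.01)
col, row = to_px(-118.0, 34.0)
```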

Interseismic GPS Time-Series Patterns in the Ventura Basin and Preliminary Comparisons to 3D Mechanical Models

Owen, Susan (USC/JPL), Michele Cooke (UMass), and Scott Marshall (UMass)

Geodetic data from the Southern California Integrated GPS Network (SCIGN) provide information on interseismic deformation patterns in the Ventura basin, which can be compared with results from three-dimensional mechanical models to test differing interpretations of active fault configurations. The Ventura basin is one of the fastest-converging areas within southern California, but the geodetic signal is complicated by strong effects from groundwater injection and withdrawal. To estimate interseismic velocities accurately, we first apply a regional filter to the time series to remove noise that is spatially correlated across the network. We then model and remove the primary seasonal signal by finding the best-fitting sine functions for each time-series component. Lastly, we use a principal component analysis to remove any remaining non-periodic time-varying signal due to groundwater changes. The interseismic signal resulting from this geodetic post-processing is compared with that resulting from three-dimensional Boundary Element Method (BEM) models that use fault surfaces defined by the Community Fault Model within the Ventura basin.
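The seasonal-correction step can be sketched as a least-squares fit of an annual sine/cosine pair alongside the secular rate. This is an illustration on synthetic data, not the actual SCIGN processing chain (which also includes the regional filter and the PCA step):

```python
import numpy as np

def remove_annual_seasonal(t_years, series):
    """Fit and remove an annual sine/cosine pair by least squares.

    A minimal sketch of the seasonal-correction step: the design
    matrix carries an offset, the secular (interseismic) rate, and
    one annual harmonic. Returns the cleaned series and the rate.
    """
    G = np.column_stack([
        np.ones_like(t_years),        # offset
        t_years,                      # secular rate (units/yr)
        np.sin(2 * np.pi * t_years),  # annual sine
        np.cos(2 * np.pi * t_years),  # annual cosine
    ])
    m, *_ = np.linalg.lstsq(G, series, rcond=None)
    seasonal = G[:, 2:] @ m[2:]
    return series - seasonal, m[1]

# Synthetic daily position series: 5 mm/yr rate, 3 mm annual cycle, noise.
rng = np.random.default_rng(0)
t = np.arange(0, 6, 1 / 365.25)
obs = 5.0 * t + 3.0 * np.sin(2 * np.pi * t + 0.4) \
      + 0.5 * rng.standard_normal(t.size)
cleaned, rate = remove_annual_seasonal(t, obs)
```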

We establish two sets of BEM models, one with remote contraction determined from the relative displacement of distal SCIGN stations and one with N-S contraction, similar to that applied in the Los Angeles basin. For each remote contraction, fault slip rates over geologic time scales are determined within BEM models that extend fault surfaces at depth to a freely slipping horizontal crack at 27 km depth representing the Mohorovičić discontinuity. To assess the suitability of the 3D model, fault slip rates and rakes from the geologic time-scale model are compared with available paleoseismic rates. Our preliminary results suggest that fault slip rates within the model are generally within the range of published paleoseismic rates for the tectonic contraction directions tested. We simulate interseismic deformation by prescribing geologic slip rates along fault surfaces below the seismogenic locking depth (18 km) in order to determine surface interseismic velocities; faults above the locking depth are locked. Features of the modeled interseismic deformation pattern resemble those of the geodetic data. Further analysis will include incorporation of a compliant material within the sedimentary basin and tests of the sensitivity of surface interseismic velocities to the roughness of the fault topology.
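The idea of driving interseismic deformation with steady slip below a locking depth has a classic 1-D analogue for a vertical strike-slip fault, the Savage-Burford arctangent profile. The sketch below illustrates only that analogue; the study itself uses 3-D BEM models on CFM fault surfaces:

```python
from math import atan, pi

def interseismic_velocity(x_km, slip_rate_mm_yr, locking_depth_km):
    """Fault-parallel surface velocity for steady slip below a locking
    depth on an infinitely long vertical strike-slip fault
    (Savage & Burford, 1973): v(x) = (s/pi) * atan(x/D).

    Illustrative 1-D analogue only; slip rate s is the full relative
    plate rate, x is fault-perpendicular distance, D the locking depth.
    """
    return (slip_rate_mm_yr / pi) * atan(x_km / locking_depth_km)

# With an 18 km locking depth (as in the abstract) and a hypothetical
# 10 mm/yr deep slip rate: zero motion at the trace, half the total
# relative rate recovered far from the fault on each side.
v_trace = interseismic_velocity(0.0, 10.0, 18.0)
v_far = interseismic_velocity(1000.0, 10.0, 18.0)
```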

Extreme Value Statistics for Near-Fault Ground Motions

Ozbey, Mehmet (UT Austin) and Ellen Rathje (UT Austin)

Random vibration theory is a powerful stochastic ground motion simulation tool when the peak values of ground motion are desired. Random vibration theory uses extreme value statistics (EVS) to predict the peak values of ground motion. One of the underlying assumptions of EVS is that the time series of interest is stationary. This assumption is always violated for earthquake-generated motions; however, several studies have shown EVS to be successful in predicting the peak values of strong ground motion (Hanks and McGuire 1981; Boore 1983; McGuire et al. 1984). Previous validations of EVS used datasets of earthquake motions from a wide range of magnitudes and distances, but the effects of magnitude, distance, and site conditions on the predictions were not examined. These effects may be significant, particularly for near-fault (R < 20 km) motions.

This study compares peak factors predicted by EVS with those measured in strong-motion recordings. The peak factor is defined as the ratio of the peak value to a_rms, where the peak value is PGA or spectral acceleration, and a_rms is the root-mean-square acceleration or root-mean-square spectral acceleration. Calculation of a_rms requires a definition of strong-motion duration; more than 20 different definitions exist, and we used several of them in the calculation of peak factors. To validate EVS for near-fault effects, the peak factors of 85 near-fault and 80 far-field strong-motion recordings were calculated and compared with their theoretical values from EVS. The effects of magnitude, distance, station orientation, site condition, and rupture direction on the peak factors are also studied.
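The peak-factor computation can be sketched as follows, using one common duration definition (the 5-95% Arias-intensity significant duration) on a synthetic record. The study itself compares several duration definitions on real recordings:

```python
import numpy as np

def significant_duration_indices(acc, dt, lo=0.05, hi=0.95):
    """Indices bracketing the 5-95% Arias-intensity significant
    duration, one of the many strong-motion duration definitions."""
    ia = np.cumsum(acc ** 2) * dt   # proportional to Arias intensity
    ia /= ia[-1]
    return np.searchsorted(ia, lo), np.searchsorted(ia, hi)

def peak_factor(acc, dt):
    """Observed peak factor: peak |acceleration| divided by a_rms
    computed over the significant duration (illustrative sketch)."""
    i_lo, i_hi = significant_duration_indices(acc, dt)
    a_rms = np.sqrt(np.mean(acc[i_lo:i_hi] ** 2))
    return np.max(np.abs(acc)) / a_rms

# Synthetic "record": Gaussian noise with a smooth envelope.
rng = np.random.default_rng(1)
dt = 0.01
t = np.arange(0, 20, dt)
acc = np.exp(-((t - 5.0) ** 2) / 20.0) * rng.standard_normal(t.size)
pf = peak_factor(acc, dt)
```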


Model Uncertainty, Earthquake Hazard, and the WGCEP-2002 Forecast

Page, Morgan (UCSB) and Jean Carlson (UCSB)

Model uncertainty is prevalent in Probabilistic Seismic Hazard Analysis (PSHA) because the underlying statistical signatures for hazard are unknown. While it is well-understood how to incorporate parameter uncertainty in PSHA, model uncertainty is more difficult to incorporate due to the high degree of dependence between different earthquake-recurrence models. We find that the method used by the 2002 Working Group on California Earthquake Probabilities (WG02) to combine the probability distributions given by multiple models has several adverse effects on their result. In particular, taking a linear combination of the various models ignores issues of model dependence and leads to large uncertainties in the final hazard estimate. Furthermore, choosing model weights based on data can systematically bias the final probability distribution. The weighting scheme of the WG02 report also depends upon an arbitrary ordering of models. In addition to analyzing current statistical problems, we present alternative methods for rigorously incorporating model uncertainty into PSHA.

Directly Probing Fault Zone Rheology: Systematic Variations in Recurrence Interval and Moment of Repeating Aftershocks

Peng, Zhigang (UCLA), John E. Vidale (UCLA), Chris Marone (Penn State), and Allan Rubin (Princeton)

The recurrence intervals for 194 repeating clusters on the Calaveras fault follow a power-law decay relation with elapsed time after the 1984 M 6.2 Morgan Hill, California, mainshock. The decay rates of repeating aftershocks in the immediate vicinity of a high-slip patch that failed during the mainshock systematically exceed those of aftershocks farther away. The trend between relative moment and recurrence interval, which is a measure of the fault-healing rate, varies systematically with depth and changes from negative to positive values as the distance between the repeating aftershock and the mainshock slip patch increases. We speculate that high strain rates in the early postseismic period may cause transient embrittlement and strengthening of the deep repeating clusters in areas adjacent to the mainshock slip patch, resulting in large moments that decrease with time as the strain rate diminishes. Our observations suggest that the systematic behavior of repeating aftershocks reflects variations in fault zone rheology.

Early Aftershock Decay Rates

Peng, Zhigang (UCLA), John E. Vidale (UCLA), Miaki Ishii (UCSD), and Agnes Helmstetter (Columbia)

Main shock rupture is typically followed by aftershocks that diminish in rate approximately as the inverse of the elapsed time since the main shock [Omori, 1894; Utsu et al., 1995]. However, it is notoriously difficult to observe early aftershock activity in the noisy aftermath of large earthquakes, and many aftershocks in the first few minutes are missing from existing seismicity catalogs [Kagan, 2004]. Yet this period holds valuable information on the transition from main shock to aftershocks, on the underlying physics that controls the occurrence of aftershocks, and on earthquake triggering. The seismicity rate immediately after a main shock is also important for short-term seismic hazard forecasting [Helmstetter et al., 2005].

We have analyzed the stacked seismicity rates of 83 M 3-5 shallow earthquake sequences in Japan from waveforms recorded by the Hi-net borehole array. By scrutinizing the high-frequency signal, we have detected up to 5 times more aftershocks in the first 200 s than appear in the Japan Meteorological Agency catalog. After correcting for the changing completeness level immediately after the main shock, the aftershock rate shows a crossover from a slow decay ~1/t^p with p = 0.5-0.6 at short times (t < 200 s) to a faster decay with p = 0.8-0.9 at later times. This trend can be explained by the rate-and-state model of Dieterich [1994] for a heterogeneous stress change. Other possible mechanisms consistent with the data include strong shaking induced by the main shock, fault healing, fluid diffusion, and aseismic slip.

In comparison, the seismicity rate for the stacked foreshock sequences follows an inverse Omori law with a p value of ~0.7-0.8 from several hundred days before the main shock up to its occurrence time, although the power-law acceleration is barely visible in individual sequences. The same method has been applied to several recent earthquake sequences in southern California, and the results are consistent with our findings in Japan.
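The exponent of an Omori-type decay ~1/t^p can be estimated, for example, by a straight-line fit of log rate against log time in logarithmic bins. The sketch below recovers a known p from synthetic aftershock times; it is not the authors' exact fitting procedure, which also corrects for time-varying catalog completeness:

```python
import numpy as np

def fit_p(times, t_lo, t_hi, nbins=20):
    """Estimate the Omori exponent p from event times by fitting
    log10(rate) vs log10(t) in logarithmic bins (simple sketch)."""
    bins = np.logspace(np.log10(t_lo), np.log10(t_hi), nbins + 1)
    counts, edges = np.histogram(times, bins=bins)
    rate = counts / np.diff(edges)                 # events per second
    centers = np.sqrt(edges[:-1] * edges[1:])      # geometric bin centers
    ok = rate > 0
    slope, _ = np.polyfit(np.log10(centers[ok]), np.log10(rate[ok]), 1)
    return -slope

# Synthetic aftershock times with a true decay 1/t^0.9 between
# 10 s and 1e6 s after the main shock (inverse-CDF sampling).
rng = np.random.default_rng(2)
p_true = 0.9
t_lo, t_hi = 10.0, 1e6
a, b = t_lo ** (1 - p_true), t_hi ** (1 - p_true)
u = rng.uniform(size=20000)
times = (a + u * (b - a)) ** (1 / (1 - p_true))
p_est = fit_p(times, t_lo, t_hi)
```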

SCEC/UseIT: Earthquake Visualization

Perez, Justin (USC)

The UseIT program is a collection of undergraduate interns working in software development. Past UseIT interns created SCEC-VDO, a plugin-based Java3D program used to visualize earthquakes in California and around the world. SCEC-VDO was an upgrade of earlier software, LA3D, which did not use plugins and was restricted to southern California. By expanding on SCEC-VDO, our summer “grand challenge” was to create software to monitor earthquake activity.

Entering the summer with no graphics experience, my biggest challenge was learning Java3D. In SCEC-VDO, my area of concentration was expanding earthquake visualization functionality. The existing earthquake display lacked an object-oriented implementation, so fellow intern Ifraz Haqque and I changed the plugin’s architecture so that every earthquake is an object. This was essential for the added functionality, including animating earthquakes and making earthquakes clickable to display information. I also worked on adding the ability to display focal mechanisms in SCEC-VDO, using two intersecting discs to represent the fault and auxiliary planes. Joshua Garcia and I used LA3D code as a reference to build the feature into SCEC-VDO. I then added a new catalog format that includes the probability that a specific plane is the fault plane for the event. Working with Kristy Akullian and under the guidance of Tom Jordan, we created a new visual style to display these data. This probability catalog was created by Po Chen, a graduate student working with Tom Jordan.

SCEC/UseIT: Southern California Earthquake Center/Undergraduate Studies in Earthquake Information Technology – The Next Generation of Internship

Perry, Sue (SCEC)

The SCEC/UseIT internship program is training the next generation of earthquake scientists, with methods that can be adapted to other disciplines. UseIT interns work collaboratively, in multi-disciplinary teams, conducting computer science research that is needed by earthquake scientists. Since 2002, the UseIT program has welcomed 64 students, in some two dozen majors, at all class levels, from schools around the nation. Each summer’s work is posed as a ‘Grand Challenge.’ The students then organize themselves into project teams, decide how to proceed, and pool their diverse talents and backgrounds. They have traditional mentors, who provide advice and encouragement, but they also mentor one another, and this has proved to be a powerful relationship. Most begin with fear that their Grand Challenge is impossible, and end with excitement and pride about what they have accomplished.

The 22 UseIT interns in summer 2005 were primarily computer science and engineering majors, with others in geology, mathematics, English, digital media, physics, history, and cinema. The 2005 Grand Challenge was to “build an earthquake monitoring system” to aid scientists who must visualize rapidly evolving earthquake sequences and convey information to emergency personnel and the public. Most UseIT interns were engaged in software engineering, bringing new datasets and functionality to SCEC-VDO (Virtual Display of Objects), 3D visualization software prototyped by interns the previous year using Java3D and an extensible, plug-in architecture based on the Eclipse Integrated Development Environment. Other UseIT interns used SCEC-VDO to make animated movies and experimented with imagery in order to communicate concepts and events in earthquake science. One movie-making project included the creation of an assessment to test the effectiveness of the movie’s educational message. Finally, one intern created an interactive, multimedia presentation of the UseIT program.

The PaleoSites Component of the California Reference Geologic Fault Parameter Database: In Service of a Statewide Uniform California Earthquake Rupture Forecast

Perry, Sue (SCEC), Vipin Gupta (SCEC), and Nitin Gupta (SCEC)

An important current effort in seismic hazard analysis is a USGS, CGS, and SCEC collaboration to lead a new Working Group on California Earthquake Probabilities (new WG) in developing a statewide Uniform California Earthquake Rupture Forecast (UCERF, discussed throughout this meeting by Field, Petersen, Wills, and others). We have constructed a site-specific fault database that will be programmatically accessed by UCERF modelers. This database will be populated by fault geologists, who will contribute their best estimates of activity parameters for study sites on key faults. To facilitate contributions, we are developing a straightforward GUI, which we will demonstrate during this meeting to obtain feedback from potential users. Our database and GUI have been created specifically to meet the needs of UCERF modelers. However, wherever possible, we have drawn on aspects of existing fault-related databases, each of which serves different purposes and users. In particular, we have refined parameters and data constructs based on the expertise of the National Quaternary Fault and Fold Database project, and have adapted the contributor GUI and style of programmatic access developed for the SCEC Fault Activity Database. We are working in close coordination with the USGS to ensure interoperability with their growing suite of fault-related databases.

Database parameters were developed with significant input from the new WG, particularly Ned Field, Glenn Biasi, Bill Bryant, Kathy Haller, and Chris Wills. The conceptual data model was discussed in detail during a two-day SCEC workshop that we organized in spring 2005, attended by all those cited here, as well as Jim Dolan, Russ Graymer, Lisa Grant, Mike Oskin, and Nathan Niemi.

SCEC/UseIT: Displaying the Anza Seismic Gap

Phillips, Kristin (UCSD)

This summer, the SCEC UseIT program brought together students of varied computer science and earth science backgrounds to work on a grand challenge: building an earthquake monitoring system that can rapidly analyze earthquake sequences as new earthquakes occur and data sets constantly expand. In pursuit of this goal, we have been expanding SCEC-VDO, three-dimensional visualization software that displays objects such as earthquakes and faults and now has an expanded range of features. One of my main tasks was researching and displaying the Anza seismic gap, a region on the San Jacinto fault where scientists have observed a deficiency in seismicity and slip compared with other segments of the fault. Consequently, some scientists have estimated that a large earthquake (M 6-6.5) could occur in the area. Because of the recent earthquakes near Anza and Yucaipa in June 2005, the Anza gap has become a place of interest. I researched the Anza gap and then assisted in the creation of three-dimensional boxes showing the gap region as portrayed in scientific papers. One difficulty in the project was not having exact coordinates and having to rely on rough estimates from small-scale figures. The resulting boxes are partially transparent, so it is possible to observe whether earthquakes occur in the Anza gap region. This project could be useful in the prediction and analysis of earthquakes as they occur.

Investigation of Shallow Crustal Weak Zone Effects on Rupture Dynamics of Surface and Subsurface Faulting

Pitarka, Arben (URS Corp.), Steven Day (SDSU), and Luis Dalguer (SDSU)

The characterization of kinematic fault rupture models that are currently used in ground motion prediction is not fully constrained by physical rupture conditions. One important observation that needs to be accommodated by the source models used in simulating broadband ground motion is the difference in ground motion characteristics between shallow and deep faulting.

The observed differences in ground motion level and frequency content between surface and subsurface ruptures are mainly due to differences in fault rupture dynamics between the weak shallow crust (upper 4 km) and the middle region of the crust. These differences in rupture process are linked with contrasts in dynamic stress drop, rupture velocity, and slip velocity between the two depth ranges.

Using dynamic rupture models of M 6.5 strike-slip scenario earthquakes on faults that do and do not break the surface, we investigate the effect of velocity strengthening in the weak zone on near-fault ground motion from surface and subsurface rupture in the 0.1-2 Hz frequency range. We will show results of our 3D finite-difference simulations and discuss their potential use for deriving constraints on the maximum values and depth variation of kinematic parameters, such as slip velocity and rise time, in the weak zone.

The Community Fault Model Version 2.5 and Associated Models

Plesch, Andreas (Harvard), John H. Shaw (Harvard), Kristian J. Bergen (Harvard), Caitlin S. Bergin (Harvard), Chris Guzofski (Harvard), George Planansky (Harvard), and the SCEC CFM Working Group (various)

We present new versions of the Community Fault Model (CFM) and the Community Block Model (CBM), as well as a newly derived rectilinear community fault model (CFM-R) and a CFM-compatible fault trace map. The CFM was substantially updated from previous versions in response to a Virtual CFM Evaluation Workshop held earlier this year. New or substantially updated fault representations in CFM 2.5 include the source of the San Simeon earthquake, the Rinconada fault, the San Joaquin fault, the Santa Monica Bay fault, the San Antonio Canyon fault, a steeply dipping alternative representation of the Santa Monica fault, and numerous small modifications resulting from generating alternative cross-cutting fault relations. In addition, quality factors for fault alternatives assigned during the Workshop were used to define a preferred fault model and a series of comprehensive alternative models.

The Community Block Model (CBM) is currently being used by members of the Fault Systems Group to develop a volumetric mesh (C. Gable). We have worked with this group to help define the CBM and evaluate test meshes, and have generated a complete frame model and an equiangular, tetrahedral mesh for the CBM micro-block model in the northern Los Angeles basin.

In response to requests from fault model consumers, we have also generated a new rectilinear fault model (CFM-R) and a fault trace map that is based on CFM 2.5. The basic process of deriving the rectilinear fault model involved measuring representative dips of fault representations, simplifying fault traces to approximately 5 km node spacing, extruding each fault segment perpendicular to its strike according to the representative dip in the CFM, and cutting the resulting patch at a regional base-of-seismicity surface. For steeply dipping, straight faults this process leads to satisfactory results. For other, more complex faults, local dips and segment endpoints were adjusted to most closely resemble the original fault geometry represented in the CFM. Currently, each fault segment in CFM-R is a quadrilateral with two parallel sides. However, other planar representations of these fault sections can be readily derived from these quadrilaterals to support planned uses in fault systems modeling and seismic hazard analysis. The fault trace map simply includes the traces of CFM 2.5 fault representations, which are generally smoothed versions of the traces provided on detailed geologic maps. The upper tip lines of blind thrusts are shown in their vertically projected positions.
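The extrusion step of the CFM-R construction can be sketched for a single trace segment: offset the trace perpendicular to strike by an amount set by the representative dip, and cut at a base-of-seismicity depth. The function below is a simplified illustration in local Cartesian coordinates, not the actual CFM-R code:

```python
from math import radians, tan

def extrude_fault_panel(p1, p2, dip_deg, base_depth_km, dip_direction=1.0):
    """Extrude a surface-trace segment into a planar quadrilateral.

    Coordinates are (east_km, north_km, depth_km), depth positive
    downward. The lower edge is offset perpendicular to strike by
    base_depth / tan(dip) and cut at the base-of-seismicity depth;
    `dip_direction` (+1/-1) selects which side the panel dips toward.
    Sketch of the CFM-R construction described in the abstract.
    """
    dx, dy = p2[0] - p1[0], p2[1] - p1[1]
    length = (dx * dx + dy * dy) ** 0.5
    # unit normal to strike in map view
    nx, ny = dip_direction * dy / length, -dip_direction * dx / length
    horiz = base_depth_km / tan(radians(dip_deg))
    lower1 = (p1[0] + horiz * nx, p1[1] + horiz * ny, base_depth_km)
    lower2 = (p2[0] + horiz * nx, p2[1] + horiz * ny, base_depth_km)
    return [(p1[0], p1[1], 0.0), (p2[0], p2[1], 0.0), lower2, lower1]

# A hypothetical 5 km east-west trace segment dipping 60 degrees,
# cut at a 15 km base of seismicity.
quad = extrude_fault_panel((0.0, 0.0), (5.0, 0.0), 60.0, 15.0)
```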

The web interface to the CFM database (http://structure.harvard.edu/cfm) was reengineered to provide a unified storage system in the database that holds both geometry and attribute data, and to distribute the CFM and associated models.

Waveform Tomography of Crustal Structure in the South San Francisco Bay Region

Pollitz, Fred (USGS) and Jon Fletcher (USGS)

We utilize a scattering-based seismic tomography technique to constrain crustal structure around the south San Francisco Bay region. This technique is based on coupled traveling-wave scattering theory. Using three-dimensional sensitivity kernels, this technique is applied to observed body and surface waves of intermediate period (3 to 4 sec dominant period) observed following eight selected regional events. We use a total of 73 seismograms recorded by a USGS short period seismic array in the western Santa Clara Valley, the Berkeley Digital Seismic Network, and the Northern California Seismic Network. Modeling of these waves is done with the following steps:

(1) Traveling waves up to a maximum horizontal phase velocity of 100 km/sec are synthesized on a reference laterally homogeneous model.

(2) 3D perturbations in shear modulus, bulk modulus, and density are prescribed in terms of smooth basis functions over a 75 km x 75 km x 24 km volume. Scaling relations among the various perturbations are used to reduce the number of parameters.

(3) Theory of coupled traveling waves (Friederich, 1999) is used to derive differential seismograms, which are rendered fully three-dimensional by iteratively using full wavefield potentials (i.e., taking successive terms in the Born series).

These steps allow estimation of the perturbation in S-wave velocity in the considered volume. There is sufficient sensitivity to both deep and shallow crustal structure that, even with the few sources employed in the present study, we obtain shallow velocity structure that is reasonably consistent with previous P-wave tomography results. We find a depth-dependent lateral velocity contrast across the San Andreas Fault (SAF), with higher velocities southwest of the SAF in the shallow crust and higher velocities northeast of the SAF in the mid-crust. The method does not have the resolution to identify very slow sediment velocities in the upper approximately 3 km, since the tomographic models are smooth at a vertical scale of about 5 km.
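Once sensitivity kernels have been assembled into a linear operator relating model perturbations to data, the estimation step reduces to a (damped) least-squares problem. The sketch below shows that generic step on a random stand-in matrix; the study's actual operator comes from coupled traveling-wave scattering theory and is iterated through the Born series:

```python
import numpy as np

def damped_least_squares(G, d, damping):
    """Generic damped least-squares model update,
    m = (G^T G + damping*I)^-1 G^T d.

    G here is a random stand-in for a matrix of sensitivity kernels
    (rows = seismograms, columns = smooth basis-function coefficients);
    this is only the estimation step, not the wavefield theory.
    """
    n = G.shape[1]
    return np.linalg.solve(G.T @ G + damping * np.eye(n), G.T @ d)

rng = np.random.default_rng(3)
G = rng.standard_normal((73, 12))   # 73 seismograms, 12 coefficients
m_true = rng.standard_normal(12)
d = G @ m_true + 0.01 * rng.standard_normal(73)
m_est = damped_least_squares(G, d, damping=1e-3)
```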

A 3-Dimensional Model of Quaternary-Age Sequences in the Dominguez Gap Region, Long Beach, California

Ponti, Daniel J. (USGS), Kenneth Ehman (Skyline Ridge), Brian Edwards (USGS), John Tinsley (USGS), Thomas Hildenbrand (USGS), John W. Hillhouse (USGS), Randall T. Hanson (USGS), Kristen McDougall (USGS), Charles L. Powell II (USGS), Elmira Wan (USGS), Michael Land (USGS), Shannon Mahan (USGS), and Andrei M. Sarna-Wojcicki (USGS)

A 3-dimensional computer model of the Quaternary sequence stratigraphy in the Dominguez gap region of Long Beach, California, has been developed to provide a robust chronostratigraphic framework for tectonic and hydrologic studies. The model consists of 13 layers within a 16.5 by 16.1 km area and extends downward to an elevation of -900 meters. Ten sequences of late Pliocene to Holocene age are identified and correlated within the model. Primary data to build the model come from five reference core holes, extensive high-resolution seismic data obtained in San Pedro Bay, and logs from several hundred water and oil wells drilled in the region. The model is best constrained in the vicinity of the Dominguez gap seawater-intrusion barrier, where a dense network of subsurface data exists. The resulting stratigraphic framework and geologic structure differ significantly from what has been proposed in earlier studies.

An important new discovery from this approach is the recognition of ongoing tectonic deformation throughout nearly all of Quaternary time, which has affected the geometry and character of the sequences. Anticlinal folding began 300-450 ka along a NW-SE trend whose crest is displaced slightly north of the Wilmington anticline. A W-NW-trending fault that approximately parallels the fold crest has also been discovered. This fault initiated ~600 ka and progressively displaces all but the youngest sequences down to the north. There is no direct evidence of fault displacement in sediment younger than 80 ka, but surface geomorphology indicates that associated folding has continued into the latest Pleistocene, possibly extending into the Holocene. Although apparent normal faulting is all that is directly observed, the fault geometry suggests a right-stepping en echelon shear zone that likely intersects the Newport-Inglewood fault zone just south of Signal Hill. The newly discovered structure possibly served, or serves, to bleed slip off the Newport-Inglewood zone and into the west Los Angeles basin.

Cost-Effectiveness of Seismically Better Woodframe Housing

Porter, Keith (Caltech), Charles Scawthorn (Kyoto University), and Jim Beck (Caltech)

Mapping the benefits of stronger dwellings. The economic benefits of enhancing the seismic strength and other attributes of four fully designed woodframe dwellings were calculated, both in terms of the benefit-cost ratio and in terms of the present value of reduced future earthquake repair costs. Maps are presented for the probabilistic seismic risk of the homes examined under the CUREE-Caltech Woodframe Project, using seismic hazard information calculated by the US Geological Survey. The analysis employed new full-scale test data and analytical tools; rigorously propagated important sources of uncertainty; used dynamic time-history structural analyses; and clearly accounted for repair costs. It is perhaps the first such analysis to be completely analytic and transparent, with no reliance on expert opinion. Seven retrofit and redesign measures were considered, as well as the benefit of higher construction quality.

Benefits can reach 8 times the initial construction cost. One retrofit exhibits benefit-cost ratios as high as 8, and its ratio exceeds 1 in half of California ZIP Codes. Four retrofit and redesign measures were estimated to be cost-effective in at least some California locations, generally near faults and on soft soil, as expected. When examining the benefit of higher-quality construction, we found that the savings in terms of reduced seismic risk can be substantial, with median savings on the order of $1,000 to $10,000 over 30 years, suggesting a quantitative argument for frequent construction inspection.
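A benefit-cost ratio of this kind compares the discounted present value of expected annual repair-cost savings against the up-front retrofit cost. The numbers below are hypothetical, chosen only to show the arithmetic, not values from the CUREE-Caltech Woodframe Project:

```python
def present_value_of_avoided_losses(annual_saving, horizon_years, discount_rate):
    """Discounted sum of a level stream of expected annual repair-cost
    savings: the benefit side of a retrofit benefit-cost ratio.

    Uses the closed form for a level annuity,
    sum_{k=1..n} 1/(1+r)^k = (1 - (1+r)^-n) / r.
    """
    r = discount_rate
    annuity = (1 - (1 + r) ** -horizon_years) / r
    return annual_saving * annuity

# Hypothetical inputs: a $5,000 retrofit that is expected to avoid
# $400/yr of earthquake repair cost over 30 years at a 5% discount rate.
retrofit_cost = 5000.0
pv = present_value_of_avoided_losses(400.0, 30, 0.05)
bcr = pv / retrofit_cost   # retrofit is cost-effective if bcr > 1
```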

Greater benefits than previously estimated. These results ignore benefits such as reduced contents damage, reduced loss of use, and human injuries avoided. That is, benefits were taken as simply the reduced repair costs, without accounting for these other resulting losses. Were these other benefits included, benefit-cost ratios would be substantially greater. These other benefits are easily included in the methodology, given the appropriate data. Including demand surge, which was ignored here, would also increase estimated benefits. The findings are particularly interesting in that past studies, made on only partially rational bases, have rarely found benefits to exceed costs. It can be inferred that benefits of enhanced seismic reliability may be significantly greater than previously estimated.

Use in making risk-management decisions. The data presented here can be used to inform risk-management decisions by homeowners, engineers, and public officials. Homeowners can use the information to decide if retrofit is likely to be worth the expense. Engineers can use the data in the development of code requirements. Public officials can use the data to target particular dwelling types and geographic locations for public-awareness programs that promote retrofit where it is likely to be cost effective.

Seismicity Rate vs. Distance from Strike-Slip Faults in Southern California

Powers, Peter M. (USC) and Thomas H. Jordan (USC)

We use three high-resolution catalogs (Hauksson [2003], Shearer [2003], and SCSN) and detailed fault representations of the SCEC Community Fault Model (CFM) to constrain seismicity rates perpendicular to strike-slip faults in southern California. Using a stacking process for earthquakes in regions proximal to major, linear fault segments, we find that the cumulative number of earthquakes scales as N ~ d^(-γ), where d is distance from a fault and γ = 0.85 ± 0.05. We verified our result by stacking across multiple spatial and magnitude ranges with various normalization methods. This value holds out to 7-8 km from a fault, beyond which 'background' seismicity dominates. Stacking across increasing lower-magnitude cutoffs indicates that b-value remains constant away from a fault and that b = 1. On the basis of this result, we hypothesize that aftershocks of an earthquake away from a fault should be biased towards and along the fault. To test this hypothesis, we filter our fault-segment sub-catalogs for mainshock-aftershock sequences using reasonable time and distance windows and stack them on the mainshocks in 2-km-wide bins away from the fault. Stacks of various mainshock magnitude ranges (within a M2.5 – 4.5 range) show that aftershocks are biased towards faults. This result compares well with a model that couples our seismicity-distance scaling relation with the observation that aftershock density scales as ρ ~ r^(-2), where r is distance from a mainshock. These data suggest that we can improve seismic triggering models by incorporating finer details of the relationship between seismicity and fault structure.
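A power-law decay exponent of this kind is typically estimated by binning stacked fault-normal distances and fitting in log-log space. A sketch on synthetic data drawn from the reported power law (the catalog, binning, and normalization choices here are illustrative, not the study's):

```python
import numpy as np

# Estimate the decay exponent gamma in n(d) ~ d**(-gamma) from fault-normal
# distances of stacked earthquakes (synthetic data; values illustrative).
rng = np.random.default_rng(0)
gamma_true = 0.85
dmin, dmax, a = 0.1, 8.0, 1.0 - gamma_true

# Draw distances from p(d) ~ d**(-gamma) by inverse-transform sampling.
u = rng.random(20000)
d = (u * (dmax**a - dmin**a) + dmin**a) ** (1.0 / a)

# Count events in log-spaced distance bins and fit a line in log-log space.
edges = np.logspace(np.log10(dmin), np.log10(dmax), 25)
counts, _ = np.histogram(d, bins=edges)
centers = np.sqrt(edges[:-1] * edges[1:])
density = counts / np.diff(edges)          # events per km
keep = counts > 0
slope, _ = np.polyfit(np.log(centers[keep]), np.log(density[keep]), 1)
gamma_est = -slope
print(f"estimated gamma: {gamma_est:.2f}")  # close to the input 0.85
```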

Empirical Green’s Functions: Uncertainty Estimation of the Earthquake Source Spectrum and Source Parameters

Prieto, G. A. (UCSD), F. L. Vernon (UCSD), and P. M. Shearer (UCSD)

The study of earthquake scaling, including estimates of source parameters such as corner frequency, apparent stress, and radiated energy, is currently in dispute. In particular, estimates of earthquake radiated energy, which depend on accurate estimates of the high-frequency part of the signal, may have large uncertainties. The current state-of-the-art technique is to use a small earthquake to provide an Empirical Green's Function (EGF) for a larger event. The assumption in this case is that the spectrum of the small event represents the transfer function of the path between the source and receiver. In the frequency domain, deconvolution is a simple spectral division, but since we are in fact dividing two random variables (the spectra), each with an associated variance, the spectral division generates a new random variable whose variance is a function of the variances of the mainshock and the EGF. The variance or uncertainty associated with the EGF-corrected source spectrum turns out to be quite large, at least half an order of magnitude in spectral amplitude. Many spectral source models can be used to fit the data within the uncertainties. We propose the use of multiple EGFs to deconvolve target events, and use a mean-square error criterion to construct a new source spectrum from a linear combination of the EGFs. By increasing the number of degrees of freedom, the variance of the resultant source spectrum is greatly reduced, and thus the uncertainty of source parameters is also reduced. We apply this approach to the October 31, 2001 M=5.1 Anza earthquake at local distances to estimate radiated energy and its uncertainties.
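The variance-reduction argument can be illustrated with a synthetic spectral division. The spectra, noise levels, and water-level choice below are invented for illustration and are not the study's data or exact procedure:

```python
import numpy as np

# Sketch: EGF deconvolution as spectral division, with several EGFs averaged
# in the log-spectral domain to reduce the variance of the recovered source
# spectrum. All signals are synthetic.
rng = np.random.default_rng(1)
n = 1024
freqs = np.fft.rfftfreq(n, d=0.01)            # 100 Hz sampling
nf = freqs.size

path = 1.0 / (1.0 + (freqs / 20.0) ** 2)      # common path/attenuation response
source = 1.0 / (1.0 + (freqs / 5.0) ** 2)     # target source spectrum

mainshock = source * path * (1 + 0.05 * rng.standard_normal(nf))

log_ratios = []
for _ in range(5):                            # five EGF events
    egf = path * (1 + 0.2 * rng.standard_normal(nf))  # noisy transfer function
    water = 1e-3 * egf.max()                  # water level to stabilize division
    ratio = np.abs(mainshock) / np.maximum(np.abs(egf), water)
    log_ratios.append(np.log(ratio))

recovered = np.exp(np.mean(log_ratios, axis=0))   # averaged EGF-corrected spectrum
err = np.abs(np.log(recovered[1:] / source[1:])).mean()
print(f"mean log-spectral misfit with 5 EGFs: {err:.3f}")
```

Averaging five independent EGFs shrinks the log-spectral scatter by roughly the square root of the number of events, which is the intuition behind the proposed multiple-EGF combination.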

Surface Wave Anisotropy and Off Great-Circle Propagation in a Dense Array: Southern California

Prindle, Kenton (UCSB) and Toshiro Tanimoto (UCSB)

We recover azimuthal anisotropy and updated S-wave velocity structure for well-resolved regions in Southern California using Rayleigh wave data obtained from the California Integrated Seismic Network. Initial results show strong anisotropy in Southern California, particularly near the Transverse Ranges, with the orientation of the fast axis being approximately margin-parallel. Resolution tests have been carried out, illustrating that only long-wavelength features (1.0 - 1.5 degrees) can be reasonably recovered. There are large contributions due to azimuthal anisotropy seen in the amplitude of S-wave velocity perturbations, but overall velocity features remain intact from our previous inversions. The strongest anisotropy anomaly occurs near the Transverse Ranges, where a postulated fast velocity root existed in our original S-wave velocity inversion.

Particle motions were analyzed in order to ensure our great-circle path assumption was adequate to explain the travel path between two stations. Initial results show that for most incoming travel path directions our assumption is valid, but for certain azimuths, particle motions deviate considerably from expected values, most likely due to refractions off of local or travel path structures. More work is necessary in order to examine these deviations in more detail.

Attenuation Relation Consistency with Precariously Balanced Rocks

Purvance, Matthew (UNR), James Brune (UNR), and Rasool Anooshehpoor (UNR)

Purvance (2005) has shown that the overturning responses of precariously balanced rocks (PBRs) depend on the PBR shapes, sizes, and the vector of ground motion intensity measures including PGA and either PGV, SA@1sec, or SA@2sec. This formulation allows one to quantify the overturning probability of an object that has been exposed to a ground motion time history with specified peak parameters (e.g., PGA and PGV). Vector-valued PSHA (VPSHA) provides a methodology to estimate the rate of occurrence of all ground motions. Combining the VPSHA output with the PBR fragility results in the rate of PBR overturning. In this way, one can compare the PBR age with the estimated failure date given the assumptions inherent in the VPSHA calculation. This study focuses on the variability of estimated failure dates of PBRs as a function of the attenuation relations utilized in the VPSHA calculations. The following attenuation relations have been investigated: Abrahamson and Silva (1997), Boore et al. (1997), Abrahamson (2005), and Gregor et al. (2002). The Boore et al. (1997) and Abrahamson (2005) relations produce ground motion models that are more consistent with the PBRs in southern California than the Abrahamson and Silva (1997) and Gregor et al. (2002) relations. It is apparent that the addition of recent data (e.g., Chi-Chi, Izmit) in the development of the Abrahamson (2005) attenuation relation significantly affects the VPSHA output. These findings highlight the utility of this methodology to test alternate attenuation relations such as those produced by the NGA initiative.
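The rate convolution at the heart of the method, hazard rate times fragility summed over joint intensity bins, can be sketched as follows. The rates and the fragility function below are hypothetical placeholders, not the VPSHA output or the Purvance (2005) fragility surfaces:

```python
import numpy as np

# Sketch: combining a vector-valued hazard output with a PBR fragility surface.
# rate[i, j] is the annual rate of motions in the (PGA_i, PGV_j) bin.
pga = np.array([0.1, 0.3, 0.5, 0.7])          # g
pgv = np.array([5.0, 20.0, 50.0])             # cm/s
rate = np.array([[1e-2, 3e-3, 1e-4],
                 [2e-3, 1e-3, 5e-5],
                 [4e-4, 2e-4, 2e-5],
                 [1e-4, 5e-5, 1e-5]])         # annual rates per joint bin

def p_overturn(a, v):
    """Hypothetical fragility: overturning more likely at high PGA and PGV."""
    return 1.0 / (1.0 + np.exp(-(a - 0.4) / 0.1)) * (v / (v + 30.0))

A, V = np.meshgrid(pga, pgv, indexing="ij")
annual_overturn_rate = np.sum(rate * p_overturn(A, V))
print(f"annual overturning rate: {annual_overturn_rate:.2e}")
print(f"implied mean time to overturn: {1 / annual_overturn_rate:.0f} yr")
```

The implied mean overturning time is what gets compared against the PBR's geologically estimated age: an attenuation relation predicting an overturning time much shorter than the rock's age is inconsistent with the rock's survival.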

Shake Table Validation of Precariously Balanced Rock Methodology

Purvance, Matthew (UNR), Rasool Anooshehpoor (UNR), and James Brune (UNR)

A series of shake table experiments has been carried out at the University of Nevada, Reno Large Scale Laboratory. The bulk of the experiments involved scaling acceleration time histories (uniaxial forcing) from 0.1 g to the point where all of the objects on the shake table overturned a specified number of times. In addition, a biaxial experiment was performed for comparison with the uniaxial results. Objects tested ranged in height from ~20 cm to ~120 cm and in height/width ratio from ~10 to ~2. The acceleration time histories utilized include strong motion recordings of the 1979 Imperial Valley, 1985 Michoacan, 1999 Izmit, 1999 Chi-Chi, 2002 Denali, and 2003 Hokkaido earthquakes as well as synthetic acceleration time histories (full sine pulses and random vibration records). The results of these experiments have been compared with overturning predictions based on the work of Purvance (2005).

Field Imaging Spectroscopy: A New Methodology to Assist the Description, Interpretation and to Archive Paleoseismological Information from Faulted Exposures

Ragona, Daniel (SIO-UCSD), Bernard Minster (SIO-UCSD), Thomas Rockwell (SDSU), Yuri Fialko (SIO-UCSD), Jouni Jussila (Specim), and Ronald Bloom (JPL)

We are developing a new methodology to acquire, interpret and store stratigraphic and structural information from geological exposures, in particular for paleoseismology. Portable hyperspectral sensors collect high-quality spectroscopic information at high spatial resolution (pixel size ~0.5 mm at 50 cm) over frequencies ranging from visible to short wave infrared. These sensors possess two primary advantages over previous methods: first, quantitative analysis of the spectra allows discrimination of features hardly visible or invisible to the human eye; and second, the images can be archived digitally for future analysis. These advantages will be of great benefit to paleoseismological studies that require a high-resolution stratigraphic and structural description to resolve the earthquake history of a site. At present, most field data are collected descriptively, which allows for significant variations in description of the sediment properties and stratigraphic relationships. The descriptions are documented on hand-drawn field logs and/or photomosaics constructed from individual photographs. The new data collection and interpretation methodology that we are developing (Field Imaging Spectroscopy) makes available, for the first time, a tool to quantitatively analyze paleoseismic and stratigraphic information. The reflectance spectra of each sub-millimeter portion of the material are stored in a 3-D matrix (hyperspectral cube) that can be analyzed by visual inspection, or by using a large variety of algorithms. The reflectance spectrum is related to the chemical composition and physical properties of the surface; therefore hyperspectral images are capable of revealing subtle changes in texture, composition and weathering. For paleoseismic studies, we are primarily interested in distinguishing changes between layers at a given site (spectral stratigraphy) rather than the precise composition of the layers, although this is an added benefit.
We have experimented with push-broom (panoramic) portable scanners, and acquired data from portions of fault exposures and cores. These images were processed using well-known image processing algorithms, and the results have been compared with field descriptions and digital photography. We have shown that SWIR images can enhance layers that are not easily seen and also make visible many features that are not visible to the human eye. Hyperspectral images also improved the results of stratigraphic correlations across the faults by using quantitative methods (spectral comparison) and image-enhancing techniques.
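The "spectral stratigraphy" idea, comparing layer spectra quantitatively rather than by eye, can be sketched with a spectral-angle measure. The spectra, materials, and match threshold below are invented for illustration; they are not the authors' processing chain:

```python
import numpy as np

# Sketch of a simple spectral-stratigraphy measure: the spectral angle between
# a reference layer's spectrum and each pixel, used to flag pixels that match
# that layer regardless of overall brightness. Spectra are synthetic stand-ins.
def spectral_angle(a, b):
    """Angle (radians) between two reflectance spectra; 0 means identical shape."""
    cos = np.dot(a, b) / (np.linalg.norm(a) * np.linalg.norm(b))
    return np.arccos(np.clip(cos, -1.0, 1.0))

bands = np.linspace(0.4, 2.5, 50)                  # um, visible to SWIR
clay_layer = np.exp(-(bands - 2.2) ** 2 / 0.02)    # absorption-like feature
sand_layer = 0.5 + 0.1 * bands                     # featureless sloping spectrum

pixels = [clay_layer * 1.2,                        # same material, brighter
          sand_layer,
          0.7 * clay_layer + 0.3 * sand_layer]     # a mixture
angles = [spectral_angle(p, clay_layer) for p in pixels]
matches = [a < 0.1 for a in angles]
print([f"{a:.3f}" for a in angles], matches)
```

Because the angle is insensitive to a multiplicative brightness factor, the first pixel matches the reference layer exactly even though it is 20% brighter, which is the kind of illumination-independent correlation useful across a trench wall.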

A 1.5 Hz Puente Hills Fault Rupture Simulation

Ramirez-Guzman, Leonardo (CMU), Tiankai Tu (CMU), Hongfeng Yu (UCD), Julio Lopez (CMU), Jacobo Bielak (CMU), and David O'Hallaron (CMU)

We present a detailed analysis of a probable Puente Hills fault rupture scenario. There is increasing interest in identifying the human and economic impacts that a large earthquake in this zone could cause. We simulate an Mw 7.1 source up to 1.5 Hz using the SCEC 3D Velocity Model 3 (CVM) and our new finite element solver, HERCULES. The 500 million degrees of freedom required are of the same order as the largest ground motion simulation done to date. We study the wave field within the LA basin and obtain spectral maps at the free surface to identify areas prone to high damage.

Grid-Search for Accelerated Moment Release (AMR) Precursors to California Earthquakes

Reissman, Jeff (Cal State Fullerton) and David Bowman (Cal State Fullerton)

Accelerated Moment Release (AMR) has been proposed as a potential medium- to long-term predictor of large earthquakes. Previous studies have shown that earthquakes in California with magnitudes greater than 6.8 are preceded by an acceleration of intermediate-sized events. We define a false alarm as a period of accelerated regional seismicity that does not culminate in a large earthquake. For AMR to be useful as a predictive tool, acceleration must precede all large earthquakes and the false alarm rate must be small and known. Bowman et al. [1998] and Ikeda [2004] predicted AMR false alarm rates based on synthetic earthquake catalogs. This study investigates the false alarm rate by using real earthquake catalogs to search for AMR signals across a broad region of California during a period of minimal earthquake activity. The AMR search is conducted on a grid covering the entire state of California. At each grid node a series of AMR calculations is performed to determine if AMR exists within circular regions over any temporal and spatial window. We construct false alarm rate curves based on the number of observed historical AMR sequences that did not culminate in a large earthquake. This provides a baseline for evaluating the utility of accelerating seismicity as a precursory signal before large earthquakes in California.
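A common way to quantify AMR (though not necessarily the authors' exact implementation) is the curvature ratio c: the misfit of a power-law time-to-failure fit to cumulative Benioff strain divided by the misfit of a straight-line fit, with c well below 1 indicating acceleration. A sketch on a synthetic accelerating sequence, with the conventional exponent m = 0.3 assumed:

```python
import numpy as np

# Curvature test for accelerating moment release: compare a power-law
# time-to-failure fit with a straight line via c = rms(power law) / rms(line).
t = np.linspace(0.0, 9.5, 40)
tf = 10.0                                  # assumed failure time
m = 0.3                                    # conventional time-to-failure exponent
benioff = 100.0 - 20.0 * (tf - t) ** m     # synthetic accelerating Benioff strain

# Misfit of the best straight line
lin = np.polyfit(t, benioff, 1)
rms_lin = np.sqrt(np.mean((np.polyval(lin, t) - benioff) ** 2))

# Misfit of the power-law model: linear regression on (tf - t)**m
x = (tf - t) ** m
pl = np.polyfit(x, benioff, 1)
rms_pl = np.sqrt(np.mean((np.polyval(pl, x) - benioff) ** 2))

c = rms_pl / rms_lin
print(f"curvature parameter c = {c:.3f}")  # well below 1 for this curve
```

In a grid search like the one described, this statistic would be evaluated for every node, radius, and time window, and a low c without a subsequent mainshock would count as a false alarm.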

SCEC/UseIT: Bringing Meaning to the Data

Robertson, Randy (USC)

UseIT is a multilateral internship created by the Southern California Earthquake Center to provide a cohesive and productive learning experience by integrating the disparate disciplines of Geology and Computer Science. This summer we were presented with the seemingly insurmountable challenge of creating an Earthquake Monitoring System with software. We were given access to a non-comprehensive set of existing software and, with much positive encouragement, basically set free. Throughout the summer I have been primarily involved in increasing the usability of the SCEC-VDO application. A big part of this was making the program more interactive by implementing features such as the ability to click on and highlight three-dimensional faults and earthquakes to identify on the fly the important characteristics of both. While the basic functionality for selecting faults programmatically was preexistent, I found the need to put much work into making the process transparent to the user and less cumbersome. Furthermore, I implemented an earthquake display subsystem to allow for this type of interaction without sacrificing performance. I also volunteered to tackle the mini-grand challenge presented to us as a group by Ken Hudnut, which involved bringing an external data set of San Andreas LIDAR data into SCEC-VDO. Although this seemed impossible due to the sheer quantity of data, I was quickly able to prototype a LIDAR plug-in which visually represented his data within the afternoon. This was particularly interesting because it allowed us to instantly compare the accuracy of a subset of the Community Fault Model with actual georeferenced surface trace data.

SCEC/UseIT: Web Services and Network Data in SCEC-VDO

Robinson, Thomas (USC)

The UseIT internship program is a chance for undergraduates to become familiar with various aspects of earthquake information technology. During summer 2005, interns accomplished this by working on SCEC-VDO, a program for the visualization of earthquake related datasets. The summer goal was to create an “earthquake monitoring system” which would allow anyone to view and manipulate relevant data to gain a better understanding of a sequence of unfolding events.

One key component of a near real-time monitoring system is quickly getting the most recent data, such as the last week of earthquakes, a task for which the internet is perfectly suited. We can now connect to a server at Caltech and programmatically bring Southern California earthquake data into SCEC-VDO, parse it, and display it. While this is especially useful for recent earthquakes, the same system can be used for retrieving historic earthquake data.

In addition to earthquake data, in the future we would like to be able to get other data from the internet, such as satellite images and city locations. While this type of data is rarely updated, the datasets are too large to download entirely, thus accessing just what we need would be ideal.

Currently, many of these services use their own methods of accessing data; however, standard interfaces would make it easier to add new functionality without writing redundant code. Hopefully such standards will be widely adopted, making it easier to programmatically access these data. The ability to access data from the internet within SCEC-VDO will directly benefit users in a variety of ways, and in the future we hope to add more of these web services.

Accurate and Explicit Modeling of a Planar Free Surface Boundary Condition by Mimetic Finite Differences

Rojas, Otilio (SDSU), Jose Castillo (SDSU), Robert Mellors (SDSU), and Steven Day (SDSU)

Popular fourth-order staggered-grid finite difference schemes for the elastic wave equation implicitly implement zero-traction boundary conditions by using symmetry conditions, vacuum formulations, or other approaches that require grid points located beyond the physical boundary (ghost points). In this work, a new set of numerical differentiators known as "mimetic" finite differences is used to explicitly solve the exact boundary conditions and fully compute the displacement vector along a planar free surface without any ghost points. The typical rotated staggered grid (RSG) and a classical Cartesian staggered grid (CSG) have been extended by the inclusion of compound nodes along the free surface boundary to allow this discretization process and place the displacement vector. Thus, two new algorithms are proposed here, one that works on an RSG and one implemented on a CSG. The accuracy of these solvers is measured in terms of the dispersion of the numerical Rayleigh wave, and comparisons against established fourth-order algorithms are presented.

SCEC/UseIT: SCEC-VDO Scripting and Movie-Making

Rousseau, Nick (PCC)

In UseIT, many improvements have been made to enhance the movie-making capabilities of SCEC-VDO. My primary research was on ways to communicate scientific earthquake visualization concepts by creating animated movies from SCEC-VDO. Using two proprietary software packages, Avid Xpress and Camtasia, simple effects were added to make the movies easier to follow. "Seven Days" and "Yucaipa Quake" are two movies I scripted from SCEC-VDO and brought into Avid, where they were enhanced and compressed into QuickTime movies. Both movies are examples of the three-step process I used to reach the final product. By using almost every aspect of SCEC-VDO during this process, I provided input for software improvement, also serving as a principal tester for its continual development. In turn, I assessed functionality concerning navigation, rendering, scripting, and fault colors in SCEC-VDO during the summer program.

Full-Waveform Earthquake Location and the Mechanics of Streaks on the Calaveras Fault

Rubinstein, Justin L. (Stanford) and Gregory C. Beroza (Stanford)

Recent advances in earthquake location techniques and their application to large waveform datasets have uncovered a number of previously obscured seismic phenomena. These phenomena have the potential to provide new insight into the physics of faulting and earthquakes. Seismic streaks are one of these phenomena, and we have developed a new method of earthquake location to study them. Our technique is based upon standard array analysis methods, but applied in the reciprocal geometry to earthquake sources instead of the more typical receiver array analysis. Using array techniques allows us to use information contained throughout the entire waveform, unlike traditional location techniques that rely solely on measurements of the direct P and S arrivals. This allows us to determine centroid locations for earthquakes previously unlocated or poorly located by standard methods due to clipping or lack of data. It also has the potential to produce high-resolution earthquake locations for similar events even with relatively few stations. Here, we apply this method to medium magnitude earthquakes (M3.5-M5) near streaks on the Calaveras fault. We find that these earthquakes initiate within the seismic streaks, and rupture into regions generally devoid of microseismicity. From this, we infer that seismic streaks represent a rheological boundary between creeping and locked sections of a fault.
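The shift-and-stack idea underlying such waveform-based location can be sketched in one dimension. The geometry, wavespeed, and waveforms below are synthetic, and the real method works with full records in the reciprocal source geometry rather than this toy beamformer:

```python
import numpy as np

# Grid-search location by shift-and-stack: align records by the predicted
# travel times for each trial location and keep the location with the
# largest stack power. Everything here is synthetic.
v = 3.5                                     # km/s, assumed wavespeed
dt = 0.01
t = np.arange(0, 10, dt)
stations = np.array([0.0, 4.0, 9.0, 15.0])  # km along a line
true_x = 6.0

def pulse(t0):
    """Gaussian pulse arriving at time t0."""
    return np.exp(-0.5 * ((t - t0) / 0.05) ** 2)

records = [pulse(1.0 + abs(true_x - s) / v) for s in stations]

trial_x = np.arange(0.0, 15.0, 0.25)
power = []
for x in trial_x:
    shifts = [int(round(abs(x - s) / v / dt)) for s in stations]
    stack = np.sum([np.roll(r, -sh) for r, sh in zip(records, shifts)], axis=0)
    power.append(np.max(stack ** 2))
best = trial_x[int(np.argmax(power))]
print(f"best-fit location: {best:.2f} km")  # near the true 6.0 km
```

Because the whole waveform contributes to the stack, the estimate degrades gracefully when individual picks would be impossible, which is the advantage claimed over arrival-time-only location.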

Earthquake Forecasting with Numerical Simulations -- Virtual California

Rundle, Paul (Harvey Mudd), Don Turcotte (UC Davis), Gleb Morein (UC Davis), John Rundle (UC Davis), and Andrea Donnellan (JPL)

In 1906 the great San Francisco earthquake and fire destroyed much of the city. As we approach the 100-year anniversary of this event, a critical concern is the hazard posed by another such earthquake. In this paper we examine the assumptions presently used to compute the probability of occurrence of these earthquakes. We also present the results of a numerical simulation of interacting faults on the San Andreas system. Called Virtual California, this simulation can be used to compute the times, locations, and magnitudes of simulated earthquakes on the San Andreas fault in the vicinity of San Francisco. Of particular importance are new results for the statistical distribution of interval times between great earthquakes, results that are difficult or impossible to obtain from a purely field-based approach.

DataFinder: Semantically-Informed Search Using Metadata

Russ, Thomas (USC/ISI) and Hans Chalupsky (USC/ISI)

DataFinder provides semantic overlays for existing metadata attributes, enriching the information content. The DataFinder system translates the domain concepts specified by the user into SCEC/CME metadata attributes. The end result of data discovery is that the SCEC/CME system users can identify the logical file names of the files needed to execute a workflow that solves their geophysical problem. DataFinder also allows users to locate end data products using domain-level descriptions instead of program-specific and varied metadata attributes.

DataFinder is part of the SCEC Community Modeling Environment’s suite of tools for generating computational workflows and locating the resulting data products. The process of running SCEC computational models produces numerous data files. These files have descriptive metadata stored as pairs of attribute names and values. Depending on which software was used to prepare the files, different attribute names and different organizational schemes are used for the metadata. Search tools for this metadata repository rely on the user knowing the structure and names of the metadata attributes in order to find stored information. Matters are made even harder because sometimes the type of information in a data file must be inferred. For example, seismic hazard maps are described simply as “JPEGFile”, with the domain content of the file inferable only by looking at the workflow that produced the file. This greatly limits the ability to actually find data of interest.

The DataFinder semantic metadata search tool uses a concept-based domain and metadata attribute ontology that links geophysical and seismic hazard domain concepts with the metadata attributes that describe the computational products. The DataFinder domain and metadata attribute ontology is represented in the PowerLoom representation language, which is based on KIF. DataFinder is implemented using a hybrid reasoning approach that combines the strengths of the PowerLoom logical reasoning engine with the database technology underlying the metadata repository.

The direct connection between DataFinder and the metadata repository database is achieved via PowerLoom’s powerful database access layer, which provides DataFinder with the necessary scalability to handle the task of locating relevant data products in a large repository. The marriage of database technology to handle large datasets, along with an expressive concept and rule language supported by the PowerLoom reasoning engine gives us both the scalability to handle the SCEC datasets as well as the inferential power to map different metadata attributes into a common set of domain terms. It also allows us to add semantic enhancements by overlaying the raw metadata with a hierarchy of concepts, providing more abstract views of the data collection.
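The attribute-to-concept mapping such a tool performs can be caricatured in a few lines. The attribute names, concepts, and files below are invented, and the real system uses PowerLoom logical reasoning over an ontology rather than a lookup table:

```python
# Sketch: heterogeneous per-tool metadata names are translated to shared domain
# concepts, so one domain-level query matches files from different workflows.
ATTRIBUTE_MAP = {
    "JPEGFile": "Image",
    "hazMapImage": "SeismicHazardMap",
    "shakemap_png": "SeismicHazardMap",
    "sgt_volume": "StrainGreenTensor",
}
CONCEPT_PARENTS = {"SeismicHazardMap": "Image"}   # simple is-a hierarchy

def concepts_of(attribute):
    """All domain concepts implied by a raw metadata attribute."""
    c = ATTRIBUTE_MAP.get(attribute)
    out = []
    while c is not None:
        out.append(c)
        c = CONCEPT_PARENTS.get(c)
    return out

files = [("run42/map.png", "hazMapImage"), ("run42/photo.jpg", "JPEGFile")]
hazard_maps = [f for f, attr in files if "SeismicHazardMap" in concepts_of(attr)]
print(hazard_maps)  # ['run42/map.png']
```

A query for the abstract concept "Image" would match both files, illustrating how a concept hierarchy provides the more abstract views of the data collection described above.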

Quaternary Faulting in the Inner Southern California Borderland, Offshore San Diego County, California

Ryan, H.F., J.E. Conrad, and R.W. Sliter (USGS)

Multichannel 2D seismic reflection (MCS) data collected by Western Geophysical in 1975 and recently made publicly available, combined with high-resolution MCS, single-channel Huntec, and Geopulse profiles collected by the USGS in 1999 and 2000, are used to address two key questions pertaining to offshore faulting in the southern California Borderland: 1) has the Oceanside detachment fault been reactivated as a thrust fault during the Quaternary, and 2) is the Palos Verdes fault zone continuous with the Coronado Bank fault zone south of La Jolla fan valley (LJFV)? The Oceanside detachment fault is best imaged north of San Mateo Point, where it dips landward (eastward), and prominent folds deform hanging wall rocks. However, the age of these folds is unknown. In some areas they appear to be onlapped by flat-lying basin sediment. South of San Mateo Point, the Oceanside detachment is not as well defined in the MCS data; however, a prominent strike-slip fault, the San Onofre-Oceanside fault (after Fischer and Mills, 1991), is observed near the base of the continental slope. Near the southern termination of this fault offshore of Carlsbad, there is another zone of folding near the base of the slope. This zone of folding is coincident with the intersection of a narrow subsurface ridge that trends at a high angle to and interacts with the margin. Recent motion of the Oceanside detachment as a thrust fault therefore appears to be limited to the area between Dana and San Mateo Points, and offshore of Carlsbad.

The Coronado Bank fault zone (CBFZ) has generally been mapped as a steeply dipping, NW-trending zone consisting of multiple strands that extend from south of the border to offshore of San Mateo Point. South of LJFV, the CBFZ is primarily transtensional and appears to terminate at the LJFV in a series of horsetail splays. Whether the CBFZ continues north of LJFV is problematic. North of the LJFV, the CBFZ forms a positive flower structure that can be mapped at least as far north as Oceanside. However, north of Oceanside, the fault zone is more discontinuous than to the south, and in places there is no strong physiographic expression of faulting. This contrasts with the Palos Verdes fault zone north of Lasuen Knoll, which shows clear evidence for recent faulting. Therefore, although the northern segment of the CBFZ may connect with the Palos Verdes fault zone, it does not appear to have a similar style of deformation, suggesting that some of the net slip between the LJFV and Lasuen Knoll may be transferred to faults other than the CBFZ.

LIDAR measurements of fault roughness

Sagy, A., G. J. Axen, and Emily Brodsky (University of California, Los Angeles)

We present our first results from analyzing the roughness of fault surfaces using ground-based LiDAR (Light Detection and Ranging). The measuring system can scan up to hundreds of square meters with individual points spaced as close as 3 mm apart. Depending on the size of the area, we can then extract thousands of fault-surface profiles in any direction along the scanned surface.

Measurements of strike-slip and normal fault surfaces suggest that an individual coherent striated slip surface is relatively smooth at any given wavelength with respect to other natural surfaces in its surroundings. Moreover, profiles along the direction of displacement are smoother than perpendicular profiles, and this anisotropy increases with profile length. These measurements, together with field observations indicating that more than one slip direction occurs along given fault segments, suggest that episodic displacement of rigid rock bodies took place along distinct surfaces during earthquakes.

Large active faults that we sampled contain an ensemble of such striated surfaces. The roughness characteristic of these complicated fault zones is dissimilar to that of a separated segment and reflects both fracture processes which “roughen” the surface and wear processes which “smooth” it.
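Scale-dependent roughness of the kind described can be illustrated on synthetic self-affine profiles standing in for slip-parallel and slip-perpendicular LiDAR transects (the amplitudes, Hurst exponent, and window lengths below are illustrative, not measured values):

```python
import numpy as np

# Detrended RMS roughness as a function of segment length, the kind of
# scale-dependent measure used to compare profile directions on a fault scan.
rng = np.random.default_rng(2)

def self_affine_profile(n, hurst, amp):
    """Random profile whose power spectrum falls off as k**-(1 + 2*hurst)."""
    k = np.fft.rfftfreq(n)
    k[0] = k[1]                               # avoid division by zero at DC
    spec = amp * k ** (-(0.5 + hurst)) * np.exp(2j * np.pi * rng.random(k.size))
    return np.fft.irfft(spec, n)

def rms_roughness(z, window):
    """Mean RMS of linearly detrended, non-overlapping segments."""
    x = np.arange(window)
    out = []
    for i in range(0, len(z) - window, window):
        s = z[i:i + window]
        fit = np.polyval(np.polyfit(x, s, 1), x)
        out.append(np.sqrt(np.mean((s - fit) ** 2)))
    return float(np.mean(out))

parallel = self_affine_profile(4096, 0.8, 0.5)       # smoother (slip-parallel)
perpendicular = self_affine_profile(4096, 0.8, 2.0)  # rougher (slip-normal)

for w in (64, 256, 1024):
    print(f"window {w:5d}: parallel {rms_roughness(parallel, w):.3f}, "
          f"perpendicular {rms_roughness(perpendicular, w):.3f}")
```

Printing the two directions side by side at several window lengths mimics how roughness anisotropy is tracked as a function of profile length.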

Fault Zone Formation and Secondary Seismic Radiation From Off-Fault Damage

Sammis, Charles G. (USC)

Rice et al. (BSSA, 2005) developed an analytic model for a dynamic 2D self-healing slip pulse that propagates at constant velocity. The stress field generated by this model was used to calculate off-fault fracture damage according to the micromechanical damage mechanics model formulated by Ashby and Sammis (PAGEOPH, 1990). We thus extend the Rice et al. (2005) analysis to quantify the widths of both the gouge zone and the zone where wall-rock damage is above regional background. Because the Ashby-Sammis damage mechanics models the slip and extension of preexisting fractures, it is possible to formulate a shear and tensile moment tensor for each initial fracture, and thus calculate the associated seismic radiation. While the radiation from each fracture is weak and very high frequency, the amplitude of the coherent summed radiation from the region around a propagating rupture front is significant and has a dominant frequency near 20 Hz.

Spatiotemporal Bayesian Analysis for Integration of GPS and DInSAR Data

Samsonov, S. (UWO), K.F. Tiampo (UWO), and J.B. Rundle (UC Davis)

Recent work in the field of hierarchical Bayesian modeling in the atmospheric sciences has resulted in development of methods for the successful integration of spatially sparse but temporally dense data with data collected on a spatially dense grid at intermittent times (Wikle et al., 1998). Here we present a method adapted from this theory for the derivation of three-dimensional surface motion maps from sparse GPS measurements and two DInSAR interferograms using Gibbs-Markov random fields equivalency within a Bayesian statistical framework (Gudmundsson and Sigmundsson, 2002; Li, 2001; Samsonov and Tiampo 2005). It can be shown that the Gibbs energy function can be optimized analytically in the absence of a neighboring relationship between sites of a regular lattice and because the problem is well posed, its solution is unique and stable. The results of inverse computer modeling are presented and show a drastic improvement in accuracy when both GPS and DInSAR data are used. Preliminary results are presented using Southern California Integrated GPS Network (SCIGN) data and DInSAR data from the Western North America Interferometric Synthetic Aperture Radar (WInSAR) archive.
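The gain from combining the two data types can be illustrated with a far simpler precision-weighted least-squares fusion at a single point. The numbers and look vector below are illustrative, and the actual method uses Gibbs-Markov random fields over a lattice rather than this pointwise estimate:

```python
import numpy as np

# Sketch of data fusion at one point: sparse GPS gives a noisy 3-D
# displacement; DInSAR adds a precise line-of-sight (LOS) constraint.
los = np.array([0.38, -0.08, 0.92])   # unit LOS vector (E, N, U), illustrative

# Observations: 3 GPS components + 1 InSAR LOS range change (mm)
d = np.array([5.0, -2.0, 10.0, 9.5])
G = np.vstack([np.eye(3), los])       # design matrix mapping (E, N, U) to data
sigma = np.array([3.0, 3.0, 8.0, 1.0])  # mm; vertical GPS is the weakest
W = np.diag(1.0 / sigma ** 2)

# Weighted least squares: m = (G^T W G)^-1 G^T W d
m = np.linalg.solve(G.T @ W @ G, G.T @ W @ d)
print("fused (E, N, U) displacement [mm]:", np.round(m, 2))
```

The precise LOS observation pulls the poorly constrained GPS vertical toward a value consistent with the interferogram, which is the one-point analogue of the accuracy improvement reported when both data types are inverted jointly.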

Recurrence Behavior of the Southern San Andreas Fault
Scharer, Katherine (Appalachian State University), Glenn Biasi (UNR), Ray Weldon (OU), and Tom Fumal (USGS)

We present a record of 28 paleoearthquakes at the Wrightwood, CA paleoseismic site and evaluate the chronology for recurrence patterns. We focus on three issues in the chronology: (1) missing earthquakes, (2) over-interpretation of evidence, and (3) record length. We show that the Wrightwood record is long enough to mitigate the impact of over-interpretation on recognizing the underlying recurrence behavior.

The chronology consists of two separate records, extending from the historic 1857 earthquake back to ~500 AD (the “young record”) and from 1500 BC to 3000 BC (the “old record”). Both records express geologic evidence for a maximum of 14 earthquakes during their respective time spans. Fourteen, however, is not an absolute count, because the stratigraphic and structural setting does not provide an infallible recorder. In the young section, an earthquake documented at Pallett Creek to the NW and at Pitman Canyon to the SE was not observed at Wrightwood. In the old section we are able to evaluate the effects of over-interpretation of the record, that is, ascribing “earthquake status” to evidence that is equivocal. If more stringent evidence of earthquakes is required, there may be as few as 11 earthquakes in the old section.

Recognizing uncertainty in the paleoseismic records, all of the tests and computations compare the young, old, and combined records, in all their permutations, for consistency and patterns. We find that inclusion or exclusion of questionable earthquakes can have moderate effects on the individual records but does not notably affect the results of any combined series.

We use two non-parametric tests for periodicity and clustering, developed in Biasi et al. (2002), to evaluate the recurrence patterns of a range of chronologies. Notably, the old record is more regular than the young: at the 80% confidence level, 70-99% of any chronology is too regular to be likely to have come from a Poisson process. We see little to no evidence of clustering in the old record relative to the allowable variability of a Poisson process. All of the possible combined records are more regular than a Poisson process: ~90% of the tests are unlikely to result from a Poisson process at the 80% confidence level. In summary, we find that, with the Wrightwood record, stable estimates of recurrence parameters can be obtained even when the exact membership of the event series is unknown or uncertain.

On average, the young and old records are similar. Modeled as a Poisson process, the most likely recurrence interval is between 100 and 140 years (95% confidence limits range from 60 to 300 years). For the combined record, the most likely value is ~105 years and the 95% range is reduced to 70 to 170 years. An informative display of the recurrence data is to combine each recurrence-interval pdf within a chronology into one averaged record. This combined record creates a multimodal distribution with a plateau from 60 to 150 years. In the 148th year of the current interseismic period, we are moving into the upper tail of the combined record.
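The averaging of recurrence-interval pdfs described above can be sketched numerically. The intervals and Gaussian uncertainties below are invented for illustration (the actual Wrightwood chronology carries far richer dating pdfs), but the construction — average the per-interval pdfs into one combined distribution and read off its mode — is the same:

```python
import numpy as np

def combined_recurrence_pdf(intervals, sigmas, t=None):
    """Average the per-interval pdfs of a paleoseismic chronology into a
    single recurrence-interval distribution. Each interval is modeled
    here as a Gaussian, a simplifying assumption."""
    if t is None:
        t = np.linspace(0, 400, 4001)  # years, 0.1 yr spacing
    pdfs = [np.exp(-0.5 * ((t - m) / s) ** 2) / (s * np.sqrt(2 * np.pi))
            for m, s in zip(intervals, sigmas)]
    return t, np.mean(pdfs, axis=0)

# Illustrative intervals (yr) loosely echoing a 60-150 yr plateau.
ivals = [70, 95, 105, 120, 150, 210]
sigs = [20, 25, 20, 30, 35, 40]
t, pdf = combined_recurrence_pdf(ivals, sigs)
mode = t[np.argmax(pdf)]   # most likely recurrence interval, yr
```

Because the combined curve is an average of unit-area pdfs, it remains a unit-area pdf, and its mode and tails summarize the chronology as a whole.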

The Relationship between the Surface Expression of Blind Thrust Faults and Crustal-Scale Deformation in the Eastern Precordillera, San Juan, Argentina

Schiffman, Celia R. (OSU) and Andrew J. Meigs (OSU)

Large earthquakes (Mw ≥ 6.5) are often accompanied by surface rupture that has a predictable relationship with magnitude. However, in large thrust earthquakes with a deep (30+ km) hypocenter or fault tip, coseismic surface deformation is expressed primarily by folding rather than by rupture along the fault surface. Knowledge of source characteristics and surficial geology is required to characterize the relationship between earthquake fault slip and coseismic folding (see Yeats, Berberian, and Xiwei poster, this conference). By fully identifying and characterizing the fault plane of the Mw 7.4 earthquake that occurred in 1944 in the eastern Precordillera of the Andes, destroying the city of San Juan in northwestern Argentina, we seek to relate active folding in the near-surface structures to the blind-thrust fault at depth. Coseismic deformation associated with the 1944 earthquake is expressed as secondary fault-related folding, and there is a large discrepancy between the amount of surface rupture and the magnitude. Subtle fold-related clues at the surface represent the only potential for recognizing the occurrence of past earthquakes. This two-part study employs seismology and structural mapping to provide a new image of the Eastern Precordillera at the crustal scale.

Source parameter inversion of teleseismic seismograms from the 1944 event places the hypocenter on a west-dipping plane approximately 30 km deep, previously defined by microseismicity, rather than on a surface-rupturing structure in the Neogene sedimentary strata. Preliminary results from field mapping show two types of folding due to a west-dipping thrust fault with a tip at 5 km depth: a broad, long-wavelength fold (~8 km) in deformed strath terraces cut into previously deformed bedrock, and short-wavelength folding and faulting in the bedrock in the form of reactivation of older thrust planes. At present, we cannot uniquely tie any one of these surficial structures to the thrust fault at depth, because the pre-existing deformation in the bedrock acts as a filter and the deformation is distributed between both the bedrock and the terraces. A further complication is how the deformation is distributed over time: both the terraces and the reactivated flexural-slip faults show differential amounts of slip over time with respect to terraces of different ages. The most recent deformation is located in a narrow band in the southeast.

We anticipate that our results will help lead to a method for relating secondary surficial information above a blind thrust fault to conclusive data on paleoseismicity. The differential deformation in terraces of varying ages indicates that the Eastern Precordillera has been active throughout the Holocene, and it probably presents a greater seismic hazard than previously believed.

Initiation, Propagation, and Termination of Elastodynamic Ruptures Associated with Segmentation of Faults and Shaking Hazard

Shaw, Bruce (Columbia)

Using a model of a complex fault system, we examine the initiation, propagation, and termination of ruptures, and their relationship to fault geometry and shaking hazard. We find concentrations of epicenters near fault stepovers and ends; concentrations of terminations near fault ends; and persistent propagation directivity effects. Taking advantage of long sequences of dynamic events, we directly measure shaking hazards, such as peak ground acceleration exceedance probabilities, without need for additional assumptions. We find that some significant aspects of the shaking hazard can be anticipated by measures of the epicenters.

Statistical Properties of Aftershock Sequences
Shcherbakov, Robert (UCD), Donald L. Turcotte (UCD), Gleb Yakovlev (UCD), and John B. Rundle (UCD)

We consider the aftershock statistics of several well-documented aftershock sequences in California and elsewhere, including the Parkfield earthquake. Three statistical distributions are studied: 1) Gutenberg-Richter frequency-magnitude statistics, 2) the modified Omori law for the decay rate of aftershocks, and 3) recurrence-interval statistics between successive aftershocks. We find good agreement with our proposed generalized Omori law when the characteristic time c is treated as strongly dependent on the magnitude cutoff and on the mainshock magnitude of the aftershock sequence. We also find that the recurrence-interval statistics are explained by a non-homogeneous Poisson process driven by the modified Omori law. The analytically derived distribution of recurrence times is applied to several major aftershock sequences in California to confirm the validity of the proposed hypothesis.
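The modified Omori law, n(t) = K/(c + t)^p, can be illustrated with a quick synthetic fit: for times well beyond c the decay is a pure power law, so a straight line in log-log space recovers the exponent p. The parameter values and noise below are invented, and this is a sketch, not the fitting procedure of the study:

```python
import numpy as np

def omori_rate(t, K, c, p):
    """Modified Omori law: aftershock rate n(t) = K / (c + t)**p."""
    return K / (c + t) ** p

# For t >> c the law is a power law, so a log-log straight-line fit
# recovers p; c matters only in the first hours after the mainshock.
rng = np.random.default_rng(0)
t = np.logspace(0.5, 2, 60)               # ~3 to 100 days after mainshock
true_K, true_c, true_p = 50.0, 0.1, 1.1   # invented parameters
obs = omori_rate(t, true_K, true_c, true_p) * rng.normal(1.0, 0.02, t.size)
slope, intercept = np.polyfit(np.log10(t), np.log10(obs), 1)
p_est = -slope                            # decay exponent estimate
```

Recovering c itself requires data from the earliest part of the sequence, which is exactly why its apparent dependence on the magnitude cutoff is a delicate observation.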

The 2001 and 2005 Anza Earthquakes: Aftershock Locations and Source-Time Functions

Shearer, Peter, Guoqing Lin, German Prieto, and Frank Vernon (IGPP/SIO/UCSD)

The recent M 5.2 earthquake (June 12, 2005) in the Anza region was preceded by a M 5.1 event (October 31, 2001) about 7 km to the southeast. Both earthquakes have similar focal mechanisms and occurred near 17 km depth in the San Jacinto fault zone within the Anza seismic gap, which has not experienced a major earthquake for at least 100 years. We examine both the aftershock sequences and source-time functions for these events. To study the aftershocks, we relocate 22,056 earthquakes between 1984 and June 2005 in the Anza region using source-specific station terms and waveform cross-correlation. Aftershocks of the 2001 earthquake are very compact and form a ~2 km cluster just south of the mainshock. The 2005 aftershocks are more distributed but lie mostly within a wedge of seismicity extending about 6 km southeast of the mainshock. A small number of aftershocks extend for tens of kilometers to the northwest along the San Jacinto fault. The aftershocks all occurred within regions of ongoing background seismicity. We apply empirical Green's function deconvolution to strong-motion records from the Anza network to resolve the mainshock P and S displacement pulses. Both earthquakes have shorter pulses to the north, suggesting northward rupture propagation. For each earthquake, we hope to use these directivity results to discriminate between the two possible mainshock fault planes: a near-vertical northeast-striking plane and a northwest-striking plane that dips to the northeast. The northwest-striking plane is parallel to the San Jacinto fault; however, Hauksson et al. (2002) suggested that the 2001 earthquake occurred on the northeast-striking plane. Assuming that the ruptures of both the 2001 and 2005 earthquakes propagated to the north, it is puzzling why most of the aftershocks occurred south of the mainshock hypocenters.

Comprehensive Analysis of Earthquake Source Spectra in Southern California

Shearer, Peter, German Prieto (SIO/UCSD), and Egill Hauksson (Caltech)

We compute and analyze P-wave spectra from earthquakes in southern California between 1989 and 2001 using a method that isolates source, receiver, and path-dependent terms. We correct observed source spectra for attenuation using both fixed and spatially varying empirical Green's function methods. Estimated Brune-type stress drops for over 60,000 earthquakes of M 1.5 to 3.1 range from 0.2 to 20 MPa, with no dependence on moment or b-value. Median stress drop increases with depth in the upper crust, from about 0.6 MPa at the surface to about 2.2 MPa at 8 km, where it levels off and remains nearly constant in the mid-crust down to about 20 km. Normal-faulting earthquakes have a higher median stress drop than strike-slip or reverse-faulting events. Spatially coherent variations in median stress drop are observed, with generally low values for the Imperial Valley and Northridge aftershocks and higher values for the eastern Transverse Ranges and the north end of the San Jacinto fault. We find no correlation between observed stress drop and distance from the San Andreas and other major faults. Significant along-strike variations in stress drop exist for aftershocks of the 1992 Landers earthquake, which may correlate with differences in mainshock slip.
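For scale, a Brune-type stress drop follows from the seismic moment and corner frequency via the inferred source radius. A minimal sketch; the shear-wave speed, the example magnitude and corner frequency, and the constant k are standard textbook choices for the Brune (1970) model, not values taken from this study:

```python
import numpy as np

def brune_stress_drop(M0, fc, beta=3500.0, k=2.34 / (2 * np.pi)):
    """Brune-type stress drop (Pa) from seismic moment M0 (N*m) and
    corner frequency fc (Hz): source radius r = k*beta/fc, then
    stress drop = 7*M0 / (16*r**3). beta is the source-region
    shear-wave speed (m/s); k = 2.34/(2*pi) for the Brune model."""
    r = k * beta / fc
    return 7.0 * M0 / (16.0 * r ** 3)

def moment_from_magnitude(Mw):
    """Hanks-Kanamori relation: M0 in N*m from moment magnitude."""
    return 10 ** (1.5 * Mw + 9.05)

# An M 2.0 event with a 15 Hz corner frequency, for scale:
M0 = moment_from_magnitude(2.0)
dsig_MPa = brune_stress_drop(M0, 15.0) / 1e6
```

With these illustrative numbers the stress drop comes out below 1 MPa, i.e. near the low end of the 0.2-20 MPa range quoted above; the strong fc^3 dependence is why attenuation corrections matter so much.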

Updates to the Crustal Motion Model: A Progress Report
Shen, Zhengkang (UCLA), Robert W. King (MIT), Duncan Carr Agnew (UCSD), and Mark Murray (UC Berkeley)

We report on our progress in extending the SCEC Crustal Motion Map (CMM) to include additional data in the SCEC area, and on its extension to cover a wider area of California. We have archived additional data (some new and some previously collected but not used) for the area of the original CMM, including new sites in Baja California, the Eastern California Shear Zone, the San Bernardino region, and the Coachella Valley, as well as surveys made around the San Simeon and Parkfield earthquakes. At the time of writing, all these data have been processed, and we are currently combining the results of daily processing to check for consistency. Subsequent combinations will include all continuous GPS stations, including a large number of SCIGN sites that did not have enough data to be included in the previous CMM. To extend the results to northern California, we will combine the results of the BAVU processing, which covers the San Francisco Bay region, with data from the Mendocino area. To provide a preliminary model for statewide modeling, we will combine velocity fields from a number of other analyses.

SCEC/UseIT: Bridging the Gap: Programmers and Users, Users and Data
Shepherd, Lori (Canisius College)

Visualization in a 3D environment is a vital aid in understanding and conveying data concerning the geosciences, especially earthquakes. This summer's Grand Challenge was to engineer 3D visualization software that would benefit scientists, public services, and the media, as well as the general public, by utilizing the software to monitor earthquake sequences and create animated movies about them. Although proprietary and less comprehensive visualization tools are available, the SCEC/UseIT intern group focused this summer's project on continued development of a more powerful and user-friendly visualization tool, SCEC-VDO (SCEC Virtual Display of Objects).

SCEC-VDO has been enhanced by improving the versatility of the plug-in that enables users to import, query, and display earthquake catalogs. For the first time, users of SCEC-VDO can import parametric data in two common catalog formats, the SCEDC and Hypo71 formats. The Southern California Earthquake Data Center (SCEDC) contains data from the Southern California Seismic Network, the longest-running network.

Although coding to improve and build upon existing Java code is beneficial, another important aspect of this summer's project was serving as a liaison between intern coders and users involved in the movie-making process. Movie scripts were created to experiment with new functionality added by other interns, to probe the software for errors, and to generate ideas for features that would be advantageous to users. This part of the project bridges the gap between the programmers' ideas and the users' needs. The connection between program coders and program users is vital to the advancement of SCEC-VDO, giving the programmers a sense of what users require, as well as of how user-friendly the current version is.

Dynamic Rupture on a Bimaterial Interface Governed by Slip Weakening Friction

Shi, Zheqiang (USC) and Yehuda Ben-Zion (USC)

We perform 2D finite-difference calculations of mode II rupture along a bimaterial interface governed by slip-weakening friction, with the goal of clarifying rupture properties in such cases and the conditions leading to the development of unilateral wrinkle-like pulses. The simulations begin with an imposed bilateral rupture in a limited source region. Rupture properties outside the imposed source are examined for ranges of values of the degree of material contrast across the fault, the difference between the static and dynamic coefficients of friction, and the difference between the static friction and the initial shear stress. The results show that, for broad ranges of realistic conditions, mode II rupture evolves with propagation distance along a bimaterial interface into a unilateral wrinkle-like pulse propagating in the direction of slip on the compliant side of the fault. In our calculations these conditions span the ranges fs-fd < 0.5 and material contrast larger than 2-5%. When the difference between the static friction and the initial shear stress is smaller, the evolution to unilateral wrinkle-like pulses occurs for smaller values of material contrast. The amount of slip increases with propagation distance, due to the incorporation of slip-weakening friction, in contrast to earlier results based on Coulomb and Prakash-Clifton friction laws with slip-independent coefficients. In all cases leading to wrinkle-like pulses, the rupture velocity in the preferred (+) propagation direction is very close to CGR, the generalized Rayleigh wave speed. Simulations with an imposed rupture speed in the source region close to the slower P wave speed can excite, in addition to the primary wrinkle-like pulse in the preferred direction with speed near CGR, a weak pulse in the opposite (-) direction with speed close to the slower P wave speed.
In some cases leading to bilateral crack-like propagation (e.g., fs-fd = 0.7), the rupture velocity is the faster P wave speed in the positive direction and the slower P wave speed in the negative direction, with the initial supershear crack front in the (+) direction followed by a pulse with speed close to CGR.
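The linear slip-weakening law used in such simulations drops the friction coefficient from its static value to its dynamic value over a critical slip distance. A minimal sketch with illustrative parameter values, not those of the simulations above:

```python
def slip_weakening_mu(slip, f_s=0.6, f_d=0.1, D_c=0.4):
    """Linear slip-weakening friction coefficient: strength falls from
    the static value f_s to the dynamic value f_d over a critical slip
    D_c (m). Parameter values are illustrative only."""
    if slip <= 0.0:
        return f_s
    if slip >= D_c:
        return f_d
    return f_s - (f_s - f_d) * slip / D_c

def fracture_energy(sigma_n, f_s=0.6, f_d=0.1, D_c=0.4):
    """Breakdown (fracture) energy per unit area, J/m**2, for linear
    weakening under normal stress sigma_n (Pa): the triangular area
    0.5 * (f_s - f_d) * sigma_n * D_c."""
    return 0.5 * (f_s - f_d) * sigma_n * D_c
```

The difference fs-fd that controls the crack-like versus pulse-like regimes in the abstract is exactly the drop in this coefficient, and the associated fracture energy sets how hard the rupture front is to advance.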

San Fernando Valley High School Seismograph Project
Simila, Gerry (CSUN)

Following the 1994 Northridge earthquake, the Los Angeles Physics Teachers Alliance Group (LAPTAG) began recording aftershock data using the Geosense PS-1 (now the Kinemetrics Earthscope) PC-based seismograph. Data were utilized by students from the schools in lesson plans and mini-research projects. Over the past year, several new geology and physical science teachers have begun using the AS-1 seismograph to record local and teleseismic earthquakes. This project is also coordinating with the Los Angeles Unified School District (LAUSD) high school teachers involved in the American Geological Institute's EARTHCOMM curriculum. The seismograph data are being incorporated with the course materials and emphasize the California Science Content Standards (CSCS). The network schools and seismograms from earthquakes in the southern California region (2003 San Simeon, 2004 Parkfield) and worldwide events (e.g., 2002 Alaska; 2004 and 2005 Sumatra) are presented. In addition, CSUN's California Science Project (CSP) and Improving Teacher Quality Project (ITQ) conduct in-service teacher (grades 6-12) earthquake workshops.

Potentially Pulverized Granites along the Garlock Fault: An Analysis of their Physical and Chemical Properties

Sisk, Matthew (SDSU), Thomas Rockwell (SDSU), Gary Girty (SDSU), Ory Dor (USC), and Yehuda Ben-Zion (USC)

We collected samples of apparently pulverized granitic rock along three transects from trenches across the Garlock fault on Tejon Ranch. Our primary purpose was to establish, by studying the chemical and physical properties of these finely powdered rocks, whether pulverized granites occur along the Garlock fault or whether their apparent pulverization is due to other processes, such as weathering from fluid circulation along the fault zone. In each transect, the granitic rock appears more pulverized (finer-grained) immediately adjacent to the active fault. Macroscopically, one observes feldspar and quartz grains that apparently represent the original grain of the rock, with individual grains up to 1-2 mm. In hand sample, the rock is so powdered that it is difficult to collect large, intact samples. Further, the visible quartz and feldspar grains smear to rock flour when pressed between the fingers.

We have completed major and trace element analyses of all samples, which allows calculation of Chemical Index of Alteration (CIA) values, and have begun analysis of particle-size distributions, whole-rock and grain density, mineralogy by XRD, and physical appearance through thin section and SEM. Collectively, the properties determined from these methods will resolve whether chemical weathering has played a significant role in the reduction of grain size of these samples. To date, the major element chemistry (CIA values) indicates that the finely powdered granitic material has a mineral assemblage identical to fresh, unweathered granite, and that there is no indication of chemical weathering. These preliminary observations indicate that these granitic rocks are indeed pulverized by mechanical processes associated with slip along the Garlock fault.
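The CIA referred to above is conventionally computed from molar proportions of the major oxides (Nesbitt and Young, 1982). A sketch with illustrative weight-percent values for a fresh granite and a hypothetical weathered sample, not data from this study:

```python
def chemical_index_of_alteration(al2o3, cao_sil, na2o, k2o):
    """Chemical Index of Alteration (Nesbitt & Young, 1982):
    CIA = 100 * Al2O3 / (Al2O3 + CaO* + Na2O + K2O) in molar
    proportions, where CaO* is CaO in the silicate fraction only.
    Inputs are weight-percent oxides; molar masses convert them."""
    mol = {"Al2O3": 101.96, "CaO": 56.08, "Na2O": 61.98, "K2O": 94.20}
    a = al2o3 / mol["Al2O3"]
    c = cao_sil / mol["CaO"]
    n = na2o / mol["Na2O"]
    k = k2o / mol["K2O"]
    return 100.0 * a / (a + c + n + k)

# Fresh granite sits near CIA ~ 45-55; chemical weathering leaches
# Ca, Na, and K and drives CIA toward 100.
cia_fresh = chemical_index_of_alteration(14.5, 3.5, 3.5, 4.0)
cia_weathered = chemical_index_of_alteration(14.5, 0.5, 0.5, 2.0)
```

A pulverized-but-unweathered sample should therefore retain a CIA indistinguishable from the fresh protolith, which is the test applied above.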

GPS Data Collection and Forward and Inverse Fault Dislocation Modeling
Skalenakis, Adam (SCEC/SURE Intern, Harvey Mudd College), Gregory Lyzenga (Harvey Mudd College), Amanda Lopez, Sally McGill, and Joan Fryxell (CSUSB)

Using GPS position data we collected over a two-month period at several sites in the San Bernardino Mountains, we will combine our results with previous data to determine each point's individual velocity. Using Disloc, analytic elastic dislocation software for determining point motion near a fault, I will first experiment with various fault motion characteristics to obtain approximate fits to the geodetic velocities.

Second, in a more analytical manner, I will use an inversion program based on the downhill simplex algorithm to rigorously obtain the best-fitting fault parameters and motion for the region, based upon the data we have collected and that of previous years.
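The forward-then-invert workflow described above can be sketched with the classic 2-D Savage-Burford screw-dislocation profile standing in for Disloc's 3-D dislocations, and SciPy's Nelder-Mead implementation of the downhill simplex; the station positions and fault parameters below are invented:

```python
import numpy as np
from scipy.optimize import minimize

def screw_velocity(x, rate, depth):
    """Savage-Burford interseismic model: fault-parallel surface
    velocity across a strike-slip fault locked above `depth` (km) and
    slipping at `rate` (mm/yr) below. A 2-D stand-in for the 3-D
    dislocation elements used by Disloc."""
    return (rate / np.pi) * np.arctan(x / depth)

# Synthetic "GPS" velocities at invented station distances (km).
x_km = np.array([-60.0, -30.0, -10.0, -3.0, 3.0, 10.0, 30.0, 60.0])
obs = screw_velocity(x_km, 25.0, 15.0)   # true rate 25 mm/yr, depth 15 km

def misfit(params):
    # Sum of squared residuals; abs() keeps depth physically positive.
    rate, depth = params
    return np.sum((screw_velocity(x_km, rate, abs(depth)) - obs) ** 2)

# Downhill simplex (Nelder-Mead) inversion from a deliberately poor start.
best = minimize(misfit, x0=[10.0, 5.0], method="Nelder-Mead")
rate_est, depth_est = best.x[0], abs(best.x[1])
```

On noise-free synthetic data the simplex walks from the poor starting guess back to the true slip rate and locking depth; with real, noisy velocities the same machinery yields best-fitting parameters plus a residual misfit.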

Physics of Rate and State Friction Evolution Laws
Sleep, Norman H. (Stanford)

Rate and state evolution equations represent the combined effect of damage from sliding and healing on the state variable. The Ruina [1983] evolution equation implies that the state variable does not change (no healing) during holds when sliding is stopped. It arises from exponential creep within a gouge when the concentrated stress at asperities scales with the macroscopic quantities. The parameter b-a represents the tendency of microscopic asperities accommodating shear creep to persist longer than asperities of compaction creep at high sliding velocities. In the Dieterich [1979] evolution law, healing occurs when the sample is at rest. It is a special case where shear and compaction stress asperities occur at different microscopic locations at a subgrain scale. It also applies qualitatively for compaction at a shear traction well below that needed for frictional sliding. Chemically, it may apply when shear sliding occurs within weak microscopic regions of hydrated silica while compaction creep occurs within anhydrous grains.
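The contrast between the two evolution laws is easy to see numerically: during a hold (v = 0) the Dieterich "aging" law heals, with the state variable growing linearly in time, while the Ruina "slip" law leaves the state untouched. A forward-Euler sketch with illustrative values only:

```python
import numpy as np

def evolve_state(theta0, v, dt, n, D_c=1e-5, law="aging"):
    """Integrate the state variable theta (s) under the Dieterich
    'aging' law, d(theta)/dt = 1 - v*theta/D_c, or the Ruina 'slip'
    law, d(theta)/dt = -(v*theta/D_c)*ln(v*theta/D_c), with forward
    Euler. v in m/s, D_c in m; values are illustrative."""
    theta = theta0
    for _ in range(n):
        if law == "aging":
            rate = 1.0 - v * theta / D_c
        else:  # slip law
            x = v * theta / D_c
            rate = -x * np.log(x) if x > 0 else 0.0
        theta += rate * dt
    return theta

# A 100 s hold (v = 0): the aging law heals (theta grows by ~100 s),
# the slip law leaves theta unchanged -- the distinction in the abstract.
th_aging = evolve_state(1.0, 0.0, 0.1, 1000)
th_slip = evolve_state(1.0, 0.0, 0.1, 1000, law="slip")

# Steady sliding: both laws share the fixed point theta_ss = D_c / v.
th_ss = evolve_state(1.0, 1e-6, 0.1, 2000)
```

Both laws share the steady-state theta_ss = D_c/v during sustained sliding; they differ only in how theta evolves away from steady state, which is exactly what hold tests probe.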

How 3D Heterogeneous Stress Changes Our Interpretation of Focal Mechanisms and Apparent Stress Rotations

Smith, Deborah Elaine (Caltech) and Thomas H. Heaton (Caltech)

We are extending stress modeling to explicitly include spatially heterogeneous stress in three dimensions. Our first observation is that if there is significant spatial heterogeneity, then popular focal mechanism inversions yield orientations biased towards the stressing rate, not the stress. This occurs because, when spatial heterogeneity is coupled to a stressing rate, the simulated region is no longer an unbiased, completely random sampler of the stress. The points that preferentially fail, and are therefore included in the inversions, are those with stress orientations that on average align with the stressing rate orientation. This suggests that if stress in the real Earth is also spatially heterogeneous, which we have reason to believe it is, then standard stress inversion studies need to be reinterpreted to include this bias toward the stressing rate.

The stressing rate can be any time-dependent perturbation to the region. Candidates include: 1) the build-up of stress on locked faults driven by far-field plate-tectonic motions, 2) post-seismic transients after a major earthquake, and 3) slow earthquakes.

We define “significant” spatial heterogeneity by comparing the size of the heterogeneous stress to the spatially homogeneous stress (spatial mean). This ratio, which we call Roughness, describes how heterogeneous the system is. Observations of fault thickness, rupture velocity, and radiated energy indicate average stresses of less than 20 MPa when averaged over dimensions of tens of kilometers; therefore, we can parameterize our spatially homogeneous stress as < 20 MPa. At the same time, rock mechanics experiments indicate that stresses averaged over meters should exceed 100 MPa to nucleate earthquakes; therefore, short-wavelength spatially heterogeneous stress could exceed 100 MPa. Using these two observations, it is possible that some regions could have a Roughness of about 5, which yields a 0.65 bias. We measure bias as follows. If the total angular difference between the average stress orientation and the stressing rate orientation is omega (i.e., rotate through an angle omega about some rotation pole), then a bias of 0.65 means that the inverted tensor orientation is a little over half-way between the average stress and stressing rate orientations, a rotation of 0.65*omega from the average stress orientation. In some regions, we estimate the Roughness may exceed 10, which yields a bias >= 0.9. At this level of bias, one cannot determine the orientation of the average stress, because the inverted tensor orientation is so closely aligned with the stressing rate orientation.

This biasing effect also has important implications for the observed “stress rotations” after a mainshock. If heterogeneity is significant, then the orientations of the aftershocks will on average align with the perturbation to the system, not the background stress + perturbation. We find: 1) Even if the background stress (spatial mean) is large compared to the stress perturbation, one could still find a measurable rotation of average focal mechanism orientations. 2) The average orientation of focal mechanisms will rotate back to the pre-mainshock orientation once the post-seismic transients decrease below the level of the plate-tectonic stressing rate.

Using Maya for Earthquake-Related Visualizations
Smith, Jeremie Ledesma (SCEC/SURE Intern, USC)

I have been using the 3D animation program Maya to create visualizations for scientists here at Scripps. I will model and animate a wireless optical seismometer. I will establish a framework for a “GeoMaya” by writing code that imports Digital Elevation Models from various programs into Maya. I will also create a 3D visualization that shows the growth of sensors and the corresponding growth of high-speed networks in the world, to see if there is a relationship between the two. Lastly, I will use my skills in Maya to model 3D objects that can be imported into Fledermaus to reference earthquake data sets.

SCEC/UseIT: Understanding Fault Hazard in California
Solomon, James (PCC)

SCEC/UseIT is an internship focused on combining computer science and seismology. Our goal this summer was to create an “Earthquake Monitoring System”: a 3D or 4D interactive virtual universe showing seismic datasets and other pertinent information. We improved an existing system created last summer by UseIT: SCEC-VDO (SCEC Virtual Display of Objects). Our software allows scientists to quickly create movies demonstrating many kinds of seismological information.

My team focused on expanding the fault-display abilities of SCEC-VDO. Our project centered on two datasets: the USGS 1996 and USGS 2002 fault models released by the NSHMP (National Seismic Hazard Mapping Project). Both models represent faults as simplified rectangles used for probabilistic seismic hazard analyses. Our USGS 2002 display, however, is more dynamic, showcasing the two different ways of gridding faults used by OpenSHA. The rectangular, or Frankel, method is identical to the USGS 1996 models in dip averaging; the corrugated, or Stirling, method uses a more localized dip average, reducing intersection and divergence within an individual fault representation.

We have also implemented a system for programmatic download of fault models. Currently, we can retrieve OpenSHA representations of the USGS 2002 dataset. However, our software allows us to easily bring in any fault representation in the OpenSHA gridded-surface format, including ones which have not yet been created.

The San Andreas and North Anatolian Faults Compared by SCEC and a Consortium of Institutes in Turkey

Sorlien, Christopher (UCSB), Marie-Helene Cormier (LDEO), Leonardo Seeber (LDEO), Thomas Jordan (USC), Naci Gorur (Istanbul Technical University), Craig Nicholson (UCSB), and Sarah Hopkins (UCSB)

SCEC has developed a collaborative research agreement with several research and governmental institutions in Turkey, which include: Istanbul Technical University (ITU), General Directorate of Mineral Research and Exploration (MTA), and Kandilli Observatory and Earthquake Research Institute of Bosphorus University. This agreement builds on many previous and ongoing research collaborations between US and Turkish scientists. The focus of this collaboration is a comparison between the North Anatolian Fault (NAF) and the San Andreas Fault (SAF). Both faults are major, right-lateral continental strike-slip faults with similar lengths and slip rates. Both faults branch into broader systems towards southern California and towards northwest Turkey, containing sub-parallel faults, and active basins and ridges.

One of the most exciting results of SCEC-funded research has been to show that faults become more active for periods of a few hundred to a thousand years, and that these periods of enhanced activity switch between sub-parallel faults or groups of faults (e.g., Dolan et al., 2004; Oskin and Iriondo, 2004). This is reflected in a striking discrepancy between geodetic and Holocene slip rates. An offset latest Pleistocene channel suggests long-term slip on the north branch of the NAF (NAF-N) is much slower than the pre-1999 GPS rate (Polonia et al., 2004). Perhaps because of reactivation of older subduction structures, first in extension and later by strike-slip, both areas display non-vertical strike-slip faults and large fault bends, which result in rapid vertical motions and either extension or contraction. New models suggest that strike-slip motion can be quantified using vertical motions and either extension or contraction between double bends in these faults. In the Marmara Sea adjacent to the NAF-N, progressive tilt is uniform with depth through the Holocene section, suggesting a uniform slip rate on the NAF at least through this period. Such determinations depend on precise stratigraphic age control giving the sedimentation rate through time and space. Continuous deposition and progressively tilting fold limbs are also common in the near-offshore of southern California, and precise dating can be used to test whether deformation is episodic on time scales of tens to hundreds of ka. Regional correlation of dated horizons 1 Ma and younger, and intervening sequence boundaries associated with glacial cycles, is used to estimate slip continuity on the partially blind Oak Ridge fault.

A Comparison of the Logging Methods for a Trench across the San Andreas Fault in the Carrizo Plain

Starke, Emily (SCEC/SURE Intern, University of Tulsa)

For my SCEC internship I helped log two trenches across the San Andreas Fault at the Bidart Fan in the Carrizo Plain. The purpose of logging was to document evidence and ages of multiple surface ruptures from large earthquakes. I experimented with two methods of logging trench T5, photo logs as well as hand logs, to compare which method is most useful for the Carrizo Plain. Given the photography equipment used and the characteristics of the exposed materials, the hand logs conveyed a clearer picture of fault rupture. Since Carrizo Plain sedimentation does not offer much variety of color, identifying evidence of rupture proved difficult with the photo logs. The hand logs were simpler to read because they had already been interpreted through drawings of the trench walls. Although this method is more subjective, the similar sedimentation of the bed layers renders the hand logs more valuable, since hand logging forces the researcher to study the trench wall with a close eye. Using the best logging method for the Carrizo Plain is crucial since the logs are used to interpret the past behavior of the fault. Through this research, we hope to ascertain the best method for documenting past behavior of the San Andreas Fault in the Carrizo Plain.

SCEC/UseIT: Smart Text Labels

Stone, Matthew (DePauw University)

The SCEC/UseIT program unites undergraduates from varied interdisciplinary fields to create a highly-functional, computer-generated, 3D earthquake analysis environment named SCEC-VDO. My main point of focus within SCEC-VDO has been work on the plugin used for adding text labels to the environment, primarily on the ability to input labels from text files. Previously, the program had been able to import text files, but the required file format was very rigid. With a little help from a few other interns, I made it possible to have the necessary components within the file (text, latitude, and longitude) in any order desired, and added optional fields for altitude and font size. Other additions to the label plugin include standard text file displays and the California city search. The standard text file displays allow the user to toggle on and off a few useful built-in text files, such as major cities in California with their population, and SCEC institutions. The California city search allows the user to input a city in California and the program will automatically locate that city on the map. One of the main roadblocks in accomplishing this task was figuring out how to code the new method of inputting files. This new method requires a substantially larger amount of computation than the previous approach. Also, never having used Java before, I spent a decent amount of time learning the new language.
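The order-independent file format described above can be sketched in code. The following is a minimal Python illustration (the actual plugin is written in Java, and the real field names and file syntax are not documented in this abstract); it assumes a comma-separated file whose header row names the columns, with `altitude` and `fontsize` as the optional fields:

```python
import csv
import io

REQUIRED = {"text", "latitude", "longitude"}
OPTIONAL_DEFAULTS = {"altitude": 0.0, "fontsize": 12.0}

def parse_labels(stream):
    """Parse label records whose columns may appear in any order.

    The header row names the fields; 'text', 'latitude', and 'longitude'
    are required, while 'altitude' and 'fontsize' are optional.
    """
    reader = csv.DictReader(stream)
    missing = REQUIRED - set(reader.fieldnames or [])
    if missing:
        raise ValueError("missing required fields: %s" % sorted(missing))
    labels = []
    for row in reader:
        label = {
            "text": row["text"],
            "latitude": float(row["latitude"]),
            "longitude": float(row["longitude"]),
        }
        for key, default in OPTIONAL_DEFAULTS.items():
            label[key] = float(row[key]) if row.get(key) else default
        labels.append(label)
    return labels

# The same records parse correctly regardless of column order.
sample = io.StringIO(
    "longitude,text,latitude,altitude\n"
    "-118.25,Los Angeles,34.05,100\n"
    "-116.55,Palm Springs,33.83,\n")
for label in parse_labels(sample):
    print(label["text"], label["latitude"], label["longitude"], label["altitude"])
```

Reading the header first is what removes the rigid-format restriction: the parser maps names to columns instead of assuming fixed positions.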

Evolution of a Rapidly-Slipping Juvenile Strike-Slip Fault: Lenwood Fault, Mojave Desert, California

Strane, Michael (UNC), Mike Oskin (UNC), Lesley Perg (UMN), and Dylan Blumentritt (UMN)

The 65 km-long Lenwood fault is an important component of a system of right-lateral strike-slip faults that comprise the Eastern California shear zone across the central Mojave Desert. The northern Lenwood fault cuts through a distinctive Miocene section made up of both the Barstow and Pickhandle formations. Within this section several piercing points establish the overall offset. The strongest of these is a thin volcaniclastic ash bed displaced 1.87 ± 0.09 km. The ash bed is found near the base of a sequence of interbedded, monolithologic landslide megabreccias that also match across the fault. Several partially offset markers allow a view of the neotectonic history of the Lenwood fault. An olivine-rich basalt unconformably overlies the Miocene rocks and is offset up to 1.1 km. Because this offset is greater than half the total observed on the fault in the area, dating this basalt will not only give a long-term slip rate but will also give insight into the inception of activity on the fault. Short-term slip rates of the fault will be determined from offset geomorphic surfaces. Many stream deflections are identified, and offsets of a few to tens of meters disrupt hillslopes and abandoned alluvial fans. The most important of the offsets occurs in the Meridian Wash section of the fault, where an alluvial fan surface (Tk) is offset 350 ± 10 m. Cosmogenic dating of the alluvial fan is in progress. Similar surfaces in the Mojave Desert displaced by the Calico fault are dated to 89 ± 6 ka, suggesting a preliminary slip rate of 3.9 ± 0.4 mm/yr for the Lenwood fault. Several structures give evidence of the linkage of distinct strands of the Lenwood fault through time. At the boundary between the Daggett Ridge and Stoddard Valley sections of the fault, a graben, 1.4 by 0.5 km in size, formed via fault slip across a dilational stepover. This graben is bounded by an inactive normal fault and both inactive and active strike-slip fault traces.
Realignment of dextral faults has led to abandonment of the normal faults bounding the graben. In the Stoddard Valley section of the fault a series of active folds deforms an older surface (Tf). Truncated strata indicate that folding was active prior to emplacement of Tf. Geochronology of the Tf surface is in progress and will lend insight into the rate of formation of these folds. At the southern end of the Meridian Wash section of the fault, an area of highly distributed faulting within the Pickhandle formation indicates that an additional linkage zone may be forming with fault strands further south.

The New SCEC Community Velocity Model (CVM-H 2.0)

Suess, M. Peter, John H. Shaw, Peter Lovely, Joachim Mueller, and Andreas Plesch (EPS, Harvard)

We present a new version of the SCEC Community Velocity Model (CVM-H 2.0), as an alternative to the standard SCEC CVM (Magistrale et al., 2000), that describes the seismic velocity (P- and S-wave) and density structure of the major Southern California sedimentary basins from northwest of Ventura to Baja California, including the offshore basins of the Western Borderland and the Salton Trough. The model is a hybrid defined by high-resolution velocity data sets from industry reflection seismic surveys, geophysical well measurements, and analytic velocity functions derived from well and refraction seismic data. The model is embedded in the regional tomographic model of Hauksson (2000). The CVM-H consists of several higher- and lower-resolution volumes, as well as two surfaces used to define the volume of sediments (the top of the pre-Cretaceous basement and topography/bathymetry). Major improvements in this version of the model include a new faulted basement surface that is compatible with the positions and offsets of major fault systems represented in the CFM, and a new high-resolution velocity model in the Salton Trough. A simple program interface, available through our website, allows velocities and densities to be retrieved for a list of geographic coordinates and depth values (http://wacke.harvard.edu:8080/HUSCV/). The model has been successfully applied in various ground motion simulations (Komatitsch et al., 2004; Lovely et al., in press), and will be available through the SCEC/CME Velocity Model Server.

A Southern California-Wide End-to-End Simulation

Krishnan, Swaminathan (Caltech), Chen Ji (UC Santa Barbara), Dimitri Komatitsch (University of Pau, France), and Jeroen Tromp (Caltech)

In 1857 a large earthquake of magnitude 7.9 occurred on the San Andreas fault, with rupture initiating at Parkfield in Central California and propagating in a southeasterly direction over a distance of more than 360 km. Such a unilateral rupture produces significant directivity toward the San Fernando and Los Angeles basins. Indeed, newspaper reports of sloshing observed in the Los Angeles river point to long-duration (1-2 min) and long-period (2-8 s) shaking. If such an earthquake were to happen today, it could have a severe impact on present-day tall buildings, especially in the mid-height range. Using state-of-the-art computational tools in seismology and structural engineering, validated using data from the Northridge earthquake, we determine the damage in 18-story steel moment-frame buildings in southern California due to ground motion from two hypothetical magnitude 7.9 earthquakes on the San Andreas fault. Our study indicates that serious damage occurs in these buildings at many locations in the region, leading to widespread building closures and seriously affecting the regional economy. For a north-to-south rupture scenario, the peak velocity is of the order of 1 m/s in the Los Angeles basin, including downtown Los Angeles, and 2 m/s in the San Fernando valley, while the peak displacement is of the order of 1 m and 2 m in the Los Angeles basin and San Fernando valley, respectively. For a south-to-north rupture scenario, the peak velocities and displacements go down by a factor of roughly 2.

Assessment of Historic Building Inventory and Code Policies in Los Angeles: 1950-2005

Swift, Jennifer (USC), Zarija Tatalovic (USC), John Wilson (USC), Thomas Jordan (USC), and Mark Benthien (USC)

As part of a recent NSF-funded project examining place-based case studies that delimit and explain the temporal and spatial transference of risk, we are examining changes in building code policies over the last century and their implementation in the Los Angeles area. Major changes in U.S. building codes have consistently followed severe California earthquakes. The early codes contained simple rules specifying minimum force levels for the design of buildings. The codes have evolved, and are evolving, into complex documents intended to manage seismic risk. The changing code policies provide insight into how vulnerability assessments would have changed over time, as our knowledge of building performance and of the seismic hazard has improved. This information will be used to perform historic damage and loss estimation analyses in HAZUS, using historical building inventories (aggregated data sets) and back-engineered HAZUS design-level mapping schemas.

Finite Element Simulations of Dynamic Shear Rupture Experiments and Path Selection Along Branched Faults

Templeton, Elizabeth (Harvard University), Aurelie Baudet (Institut des Sciences et de l'Ingenieur), and James R. Rice (Harvard University)

The study of dynamically propagating shear cracks along geometrically complex paths is important to understanding the mechanics of earthquakes. We adapted the ABAQUS/Explicit dynamic finite element program to analyze the propagation of shear cracks along branched weakened paths. The configurations for the weakened paths correspond to those used in recent laboratory fracture studies of Carl E. Rousseau (Univ. of Rhode Island) and Ares J. Rosakis (Caltech), privately communicated, examining a branched configuration in a manner analogous to their previous study of rupture along a bent fault path [Rousseau and Rosakis, JGR, 2003]. Their experiments involve impact loading of thin plates of Homalite-100, a photoelastically sensitive polymer, which are cut along branched paths and then weakly glued back together everywhere except along a starter notch near the impact site. Strain gage recordings and high-speed photography of the isochromatic fringe patterns (lines of constant difference between in-plane principal strains) provided characterizations of the transient deformation field associated with the impact and rupture propagation. In the experimental studies, rupture propagation speeds along the weakened path before the branch were found in both the sub-Rayleigh and intersonic (supershear) regimes and depended on impact velocity. The choice of rupture path after the branching junction, and the speed of rupture propagation along the branched and horizontal segments of the weakened paths, depended on the angle of inclination as well as on the rupture speed along the main path.

For the finite element analyses, we implemented a linear slip-weakening failure model as a user defined constitutive relation within the ABAQUS program, where weakening could be included in either or both of (1) a cohesive part, C = C(slip), of the shear strength that is insensitive to compressive normal stress, sigma, and (2) a frictional part proportional to sigma, with friction coefficient f = f(slip). That is, shear strength = C(slip) + f(slip) sigma. The analyses of impact loading, and rupture nucleation and propagation were carried out in the 2D framework under plane stress conditions.
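The strength relation quoted above, shear strength = C(slip) + f(slip) * sigma, can be written out directly. The sketch below assumes linear weakening of both the cohesive part and the friction coefficient over a critical slip distance; all parameter values are illustrative defaults, not values from the study:

```python
def linear_weakening(peak, residual, slip, d_c):
    """Linear decrease from `peak` to `residual` over critical slip `d_c`."""
    if slip >= d_c:
        return residual
    return peak - (peak - residual) * slip / d_c

def shear_strength(slip, sigma_n, c0=1.0e6, cr=0.0, f0=0.6, fr=0.2, d_c=0.4):
    """Shear strength = C(slip) + f(slip) * sigma_n (Pa), with both the
    cohesive part C and the friction coefficient f weakening linearly
    with slip. Parameter values here are illustrative, not from the study."""
    cohesion = linear_weakening(c0, cr, slip, d_c)
    friction = linear_weakening(f0, fr, slip, d_c)
    return cohesion + friction * sigma_n

# Strength drops monotonically from peak to residual as slip accumulates
# (sigma_n = 50 MPa compressive normal stress).
for slip in (0.0, 0.2, 1.0):
    print(slip, shear_strength(slip, 50e6))
```

Keeping C and f as separate slip-dependent terms mirrors the decomposition in the text: the cohesive part is insensitive to normal stress, while the frictional part scales with it.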

A set of studies varying the slip-weakening law and the impact velocity was carried out to investigate the relationship between the strength of the interface and the speed of rupture propagation. We studied branches on both the extensional and compressional sides of the main weak path. Examining outcomes of simulations in a phase diagram whose axes are f(0) and C(0)/B, where B is proportional to impact speed and is a characteristic impact stress level, we find that rupture propagation speed on the main fault increases with decreasing C(0)/B, but is not affected by f(0). A branch to the extensional side is always taken and, beyond the branching junction, increasing f(0) decreases the rupture velocity on the continuation of the main fault but increases rupture velocity on the branch.

The speed of rupture along the branch in the analyses depends on both the branch angle and the speed of rupture propagation along the main fault. Whether the rupture is propagating at an intersonic or sub-Rayleigh speed when it reaches the branching junction has a large impact on the nature of rupture propagation along the inclined path. When the rupture propagates along the main fault at an intersonic speed, shear shock waves intersecting the inclined path may trigger rupture along it. While not obtaining extremely close agreement with the high-speed experimental measurements, principal features observed in dynamic isochromatic line patterns could be reproduced.

TeraShake-2: The Next Steps

The SCEC-ITR collaboration

The SCEC-ITR collaboration achieved a series of earthquake simulations on the southern segment of the San Andreas fault that brought to light a heretofore unsuspected pattern of seismic hazard, linked to fault geometry, rupture directivity, and wave propagation through the three-dimensional SCEC Community Velocity Model (Olsen et al., 2005). This set of simulations, labeled TeraShake-1, relied on a kinematic fault rupture model scaled from inversion results for the 1999 Denali earthquake. That source model is severely constrained, and questions remain about the level of uncertainty we should attach to these results.

The next stage of development is to accommodate a spontaneous rupture model, using one or more of the several codes being compared as part of the SCEC spontaneous rupture code validation effort. About a dozen such codes exist. We are initially restricting our focus to four: (1) the 2nd-order Dynamic Fault Model (DFM) FD code, which has passed numerous tests; (2) the 4th-order FD Anelastic Wave Propagation Model (AWM) code, used in the TeraShake-1 and CyberShake projects; (3) the UCSB finite element spontaneous rupture and wave propagation code; and (4) the SIO “mimetic” Support-Operator Rupture Dynamics (SORD) code. All four are being prepared for incorporation into the SCEC Community Modeling Environment (CME), although they are at quite different levels of maturity. All are described in separate contributions to the 2005 SCEC annual meeting.

A major challenge is that the range of scales involved in such simulations is enormous: the inner scale associated with rupture characteristics is in the range of 1-100 meters, while the outer scale associated with geology, wave propagation, and hazard assessment reaches 600 kilometers. Time scales cover an equally broad range of magnitudes. This combination raises a numerical “grand challenge” that pushes the limits of available computing systems. Additional considerations include desired capabilities to model nonplanar faults, and to include surface topography in the calculation. Recent SCEC research results show that such complications are indeed potentially important for accurate ground motion predictions.

The SCEC-ITR group has devised a consensus approach based on a two-stage modeling strategy. In the first stage, a spontaneous rupture model is run on a fine grid with limited geographical extent, using absorbing boundary conditions to control the effects of reflections on the rupture process. The output of this run, a complete rupture history along the fault, is then used as a kinematic source in the SCEC/CME AWM code to propagate the seismic waves to regional distances, as was done in TeraShake-1. The simulations are being done on the TeraGrid collection of platforms. This approach admittedly represents an approximation that requires validation. It also raises very significant IT challenges in terms of computational performance, storage, communications, and visualization.

We report on progress achieved so far along all of these lines, and describe our plans to achieve the next level of simulations.

NetMAP, an Online GIS for Geoscientists

Tierney, Timothy (CartoGraph.com)

NetMAP was conceived of and developed by a geologist as a solution for the frustrations that geoscientists face with traditional GIS software. It aims to provide basic GIS functionality via a system that is easily learned and which is optimized for sharing and collaboration.

NetMAP is available via login from any web browser. All data are therefore automatically web-ready, making the system ideal for distributed multi-author collaboration as well as for public education projects. Data may be either vector or raster and may be exchanged with other GIS software as well as with GPS units.

For public education, NetMAP can be used to publish complex map data to the web in an interactive form that allows the user to zoom/pan, query features, activate embedded hyperlinks, etc.

Custom web input forms can be integrated into NetMAP for creation of richly-cataloged geospatial data: at UCSB a 10,000 item collection of field photos is being cataloged with spatial coordinates and extensive descriptive information.

Estimating a Slip Budget along the Parkfield Segment of the San Andreas Fault: A Slip Deficit since 1857

Toké, Nathan (ASU) and Ramón Arrowsmith (ASU)

We interpret the strain release history along the Parkfield segment of the San Andreas fault (SAF) and assess the existence of a slip deficit along this portion of the fault by estimating a slip budget since 1857: slip deficit = (long-term slip rate * time) – (aseismic slip release + coseismic slip release). We assumed that the 1857 Fort Tejón earthquake released all previously accumulated strain. We calculated the aseismic slip released along the fault by depth-averaging the along-fault creep rates of Murray et al. [Geophys. Res. Lett., 28, 359-362 (2001)] and by using aseismic slip rates from the creeping segment (to the northwest). We assumed these rates have been constant since 1857. We assumed no aseismic slip has occurred along the Cholame segment (to the southeast) since 1857. We used epicenter and magnitude estimates of Toppozada et al. [Bull. Seis. Soc. Amer., 92, 2555-2601 (2002)] and empirical relationships between magnitude, rupture length, and average displacement (Wells and Coppersmith [Bull. Seis. Soc. Amer., 84, 974-1002 (1994)]; modified for strike-slip earthquakes in California by Arrowsmith et al. [Seis. Res. Lett., 68, 902-916 (1997)]) to calculate rupture extents and average coseismic slip for historical earthquakes near Parkfield. We assumed symmetric rupture and constant slip centered on the reported epicenter positions projected onto the SAF. Additionally, we used more precisely known information about the 1966 and 2004 Parkfield events (e.g., Lienkaemper and Prescott [J. Geophys. Res., 94, 17647-17670 (1989)]) to estimate their rupture lengths and mean slip. We also considered scenarios in which the 2004 and 1966 ruptures are assumed representative of previous ruptures to estimate the coseismic slip release since 1857. Our results suggest the slip deficit along the northwestern Cholame segment is about 5 m because of an absence of any slip since 1857. This is approximately the mean of the range of 1857 offsets on the Cholame segment.
The slip deficit is much greater than the few 1857 offsets measured in the southeast portion of the Parkfield segment. Thus, our estimation suggests that the slip deficit in southeast Parkfield and Cholame may be as great as, or may have surpassed, the slip experienced on these segments in 1857. The slip deficit abruptly decreases to the northwest across the central Parkfield segment: it is 0.5-2 m near the town of Parkfield and 0-1 m northwest of Middle Mountain. Assuming a variable rupture model, an M7 event rupturing all or part of the Cholame segment and the southern Parkfield segment (with slip decreasing to the NW) is plausible. Importantly, this result also shows that the change in the pattern of strain release occurs in the middle of the “Parkfield Segment” of the SAF, rather than at its boundaries (northwest of Middle Mountain and at Highway 46).
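The bookkeeping in the slip-budget equation is simple arithmetic; the sketch below uses round, illustrative numbers (a nominal 34 mm/yr rate on a fully locked patch), not the study's actual inputs:

```python
def slip_deficit_mm(slip_rate_mm_yr, years, aseismic_mm, coseismic_mm):
    """Slip deficit = long-term slip rate * elapsed time
    minus the sum of aseismic and coseismic slip release (all in mm)."""
    return slip_rate_mm_yr * years - (aseismic_mm + coseismic_mm)

# Illustrative only: a fully locked patch loading at ~34 mm/yr over the
# 148 years from 1857 to 2005, with no slip release, accumulates ~5 m,
# comparable to the deficit estimated for the northwestern Cholame segment.
deficit = slip_deficit_mm(34.0, 2005 - 1857, 0.0, 0.0)
print(deficit / 1000.0, "m")
```

On a creeping reach, the aseismic term grows with time and the computed deficit shrinks accordingly, which is the pattern the abstract describes northwest of Middle Mountain.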

From Physical Modeling to Scientific Understanding: An End-to-End Approach to Parallel Supercomputing

Tu, Tiankai (CMU), Hongfeng Yu (UC Davis), Leonardo Ramirez-Guzman (CMU), Jacobo Bielak (CMU), Omar Ghattas (UT Austin), Kwan-Liu Ma (UC Davis), and David O'Hallaron (CMU)

Conventional parallel scientific computing uses files as the interface between simulation components such as meshing, partitioning, solving, and visualizing. This approach results in time-consuming file transfers, disk I/O, and data format conversions that consume large amounts of network, storage, and computing resources while contributing nothing to the applications themselves. We propose an end-to-end approach to parallel supercomputing. The key idea is to replace the cumbersome file interface with a scalable, parallel, runtime data structure, on top of which all simulation components are constructed in a tightly coupled way. We have implemented this new methodology within an octree-based finite element simulation system named Hercules. The only input to Hercules is the material property description of a problem domain; the only outputs are lightweight JPEG-formatted images generated at every visualization time step as the simulation runs. There is no other intermediate file I/O. Performance evaluation of Hercules on up to 2048 processors of the AlphaServer system at the Pittsburgh Supercomputing Center has shown good isogranular scalability and fixed-size scalability.

SCEC/UseIT: Creating an Interactive CD-ROM to Showcase Intern Research

Van Buren, Jason (Santa Monica College)

As a member of the Undergraduate Studies in Earthquake Information Technology (UseIT) internship program, a research initiative merging the fields of earthquake science and computer technology, I have been responsible for creating a CD-ROM to showcase the work produced by the program. The contents of the disc include graphic visualizations created by interns, as well as links to the website where intern-created software such as LA3D and SCEC-VDO can be demonstrated and/or downloaded. The disc also contains photos of the interns, intern interviews, and pictures and copy from field trips and guest-scientist visits.

The purpose of this project is to demonstrate the work that UseIT interns are involved in and to showcase their accomplishments, as well as to provide information to prospective interns. Information on applying to the program is included into the disc, as well as a link to the online application. The disc can easily be played on any computer with a CD-ROM drive. The content is designed using HTML, so that it can be viewed simply in an internet browser. This then provides the ability for a seamless interface between updated and new content online, and content from the CD. Also involved in this process is the re-design of the UseIT website, which mirrors much of the content on the CD and provides updated and newly added resources.

My research project was to create this disc from concept to completion, handling everything from video editing and photography to web design and graphic disc design, from the planning stages through final development, all in a short amount of time and under a tight budget. In order to represent the various projects of the interns, I had to understand each person's research. I also collaborated with other interns to create a new logo for UseIT, which now serves as the graphic identity for the program. When the project was complete, I created a final master disc and sent it off to be replicated. We now have a physical representation of the UseIT program to hand out to both the science community and prospective interns.

Creep Measurements and Depth of Slip along the Superstition Hills Fault as Observed by InSAR

Van Zandt, Afton J. (SCEC/SURE Intern, SDSU) and R. J. Mellors (SDSU)

Data from 65 ERS-1 and ERS-2 interferograms (descending, track 356, frame 2943) covering the Western Salton Trough and spanning the time period from 1992 to 2000 are used to measure surface deformation along the Superstition Hills fault. We model the near-fault (within 5 km) deformation along the Superstition Hills fault using a 2D analytic model of a vertical strike-slip fault. We assume all the observed signal is due to shallow slip (above the seismogenic zone). Using data from 4 cross-sectional profiles of interferograms across the fault, we find an average slip rate of 7.2 ± 2.1 mm/yr extending to a depth of 3.6 ± 1.5 km. The lower bound of the shallow creep appears to deepen to the northwest along the Superstition Hills fault.
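A common choice for a "2D analytic model of a vertical strike-slip fault" is the elastic half-space screw dislocation. The sketch below assumes that form for slip extending from the surface to a finite depth; this is an assumption for illustration, since the abstract does not state the exact formulation used:

```python
import math

def shallow_creep_displacement(x_km, slip, depth_km):
    """Fault-parallel surface displacement at distance x_km from a vertical
    fault slipping by `slip` from the surface down to `depth_km`, in an
    elastic half-space (screw dislocation). The motion steps by the full
    slip across the fault trace and decays toward zero over a distance
    comparable to the slip depth."""
    return (math.copysign(slip / 2.0, x_km)
            - (slip / math.pi) * math.atan(x_km / depth_km))

# With 7.2 mm/yr of creep to 3.6 km depth (the values reported above),
# most of the relative motion is accommodated within a few km of the trace.
for x in (-20.0, -2.0, 2.0, 20.0):
    print(x, round(shallow_creep_displacement(x, 7.2, 3.6), 3))
```

Fitting this single-parameter-family curve to cross-fault InSAR profiles is what allows both the slip rate and the depth extent of creep to be estimated from the shape of the near-fault deformation.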

Prediction of Triggered Slip and Aftershocks in the Salton Trough: Which is Better, Dynamic or Static Coulomb Failure Stresses?

Verdugo, Danielle (SDSU), Julia Clark (SDSU), Kim Olsen (SDSU), and Rob Mellors (SDSU)

We have modeled static (CFS) and dynamic (dCFS(t)) Coulomb failure stresses within a 140 km by 140 km area in southern California for four recent historical M>6 earthquakes (the 1968 Borrego Valley, 1979 Imperial Valley, 1987 Elmore Ranch, and 1987 Superstition Hills events) using a fourth-order finite-difference method and a layered crustal model. The dynamic stresses, quantified as the peak values of dCFS(t), are computed using slip models estimated from measured surface offsets and extended down to the bottom of the fault. The CFS and dCFS(t) are correlated with aseismic slip recorded on nearby structures in the Salton Trough, as well as with aftershocks from the four events. Our simple models suggest that, compared to static Coulomb failure stress patterns, the patterns of peak dCFS(t) show a higher correlation with triggered slip along nearby faults and, for some of the events, with the location of aftershocks that occurred up to four years after the events. This finding is due to the rupture propagation effects, particularly directivity, included in dCFS(t) (e.g., Kilb et al., 2000) but omitted from the CFS. Future studies should include a 3D crustal model and refined rupture propagation in the dCFS(t) computation.
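The Coulomb failure stress change underlying both the static and dynamic maps takes a standard form; the sketch below assumes the usual effective-friction formulation, with a commonly used friction value rather than one stated in the abstract:

```python
def coulomb_failure_stress_change(d_shear, d_normal, mu_eff=0.4):
    """dCFS = d_shear + mu_eff * d_normal (Pa), with the shear stress
    change resolved in the receiver fault's slip direction and the normal
    stress change positive in extension (unclamping). mu_eff = 0.4 is a
    common effective friction coefficient, not a value from this study."""
    return d_shear + mu_eff * d_normal

# Positive dCFS moves the receiver fault toward failure.
print(coulomb_failure_stress_change(0.1e6, 0.05e6))   # both terms encourage slip
print(coulomb_failure_stress_change(-0.1e6, 0.0))     # shear change inhibits slip
```

In the static case the stress changes are the final, permanent values; in the dynamic case dCFS(t) is evaluated through time as the waves pass, and its peak value is what the study correlates with triggered slip.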

A Vertical Motion Database for Southern California

Verdugo, Danielle (SDSU), Mike Oskin (UNC), Tom Rockwell (SDSU), and Nathan Niemi (Caltech)

The vertical motion database for Southern California is a compilation of geologic data, reference frame information, and processing tools to determine vertical crustal motions at 10^4 to 10^6 year timescales. All original data, reference frames, and processing information are encapsulated within a PostgreSQL object-relational database. Querying data proceeds interactively via a web interface to the database through three steps: (1) select data points, optionally filtered by location and data type, (2) select one of the appropriate reference frames for each data type in the selected set, and (3) process the data points into vertical motion rates. Data compilation efforts are complete for marine terraces from central California to the border with Mexico. The majority of these data are for terraces formed 80-120 ka near the present coastline, with a few older points inland. Thermochronology data available for the Transverse Ranges have been compiled to provide exhumation rates (a proxy for uplift rates) at million-year timescales. River terrace and aquifer elevation data have also been added and include: Ventura River terraces, Los Angeles River terraces (along the Elysian Park anticline), Santa Ana River terraces (the Yorba Linda terrace, Grand Terrace, and the Santiago, San Timoteo, and Reche Creek terraces), and the San Gabriel River terraces. Efforts are ongoing to incorporate compiled stratigraphic horizon information into the database, and challenges remain in bridging reference frames between the coastal and interior basins. The web interface for obtaining and processing information from the vertical motion database is available at geomorph.geosci.unc.edu/vertical. Results may presently be viewed online in table format, downloaded as a GIS-compatible file, or browsed via the Google Maps web service.

A Survey of 71 Earthquake Bursts across Southern California: Exploring the Role of Pore Fluid Pressure Fluctuations and Aseismic Slip as Drivers

Vidale, John E. (UCLA) and Peter Shearer (UCSD)

We investigate the cause of seismicity swarms by examining a waveform-relocated catalog for southern California between 1984 and 2002 and systematically identifying 71 isolated sequences of 40 or more earthquakes occurring within a 2-km-radius volume and a four-week interval. 57 of the 71 bursts are difficult to interpret as primarily a mainshock and its Omori-law-abiding foreshocks and aftershocks because they exhibit a more complicated evolution in space, time, and magnitude; we identify 18 of these sequences as particularly swarm-like. Evidence against a simple cascade of elastic stress triggering includes the presence of an interval of steady seismicity rate, the tendency of the largest event to strike later in the sequence, the large spatial extent of some of the swarms compared to their cumulative moment, and the weak correlation between the number of events in each burst and the magnitude of the largest event in each burst. Shallow sequences and normal-faulting-mechanism sequences are most likely to be swarm-like. The tendencies of the hypocenters in the swarm-like sequences to occur on vertical planes and expand over time suggest pore fluid pressure fluctuations as the most likely mechanism driving the swarm-like seismicity bursts. However, episodic aseismic slip could also be at least partly responsible, and might provide a more compelling explanation for the steady rate of seismicity during swarms, whereas fluid pressure perturbations might be expected to diminish more rapidly with time. Both aftershock-like and swarm-like seismicity bursts are distributed across the entire study region, indicating that they are a general feature of tectonic faulting, rather than limited to a few geological conditions such as volcanic or geothermal areas.
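The burst-selection criterion above (40 or more earthquakes within a 2-km radius and a four-week interval) can be sketched as a brute-force catalog scan. The version below is a simplification using epicentral distance only and hypothetical `(time_days, lat, lon)` tuples, whereas the study used a 2-km-radius volume (i.e., including depth) and a relocated catalog:

```python
from math import radians, sin, cos, asin, sqrt

def epicentral_km(lat1, lon1, lat2, lon2):
    """Great-circle distance in km between two epicenters (haversine)."""
    dlat, dlon = radians(lat2 - lat1), radians(lon2 - lon1)
    a = (sin(dlat / 2.0) ** 2
         + cos(radians(lat1)) * cos(radians(lat2)) * sin(dlon / 2.0) ** 2)
    return 2.0 * 6371.0 * asin(sqrt(a))

def find_bursts(events, min_count=40, radius_km=2.0, window_days=28.0):
    """Return indices of events with at least `min_count` neighbors
    (including themselves) within `radius_km` and `window_days`.
    Brute-force O(n^2) scan -- a sketch, not catalog-scale code."""
    flagged = []
    for i, (t0, lat0, lon0) in enumerate(events):
        n = sum(1 for (t, lat, lon) in events
                if abs(t - t0) <= window_days
                and epicentral_km(lat0, lon0, lat, lon) <= radius_km)
        if n >= min_count:
            flagged.append(i)
    return flagged

# A synthetic 40-event cluster is flagged; an isolated event is not.
catalog = [(0.1 * i, 34.05, -117.20) for i in range(40)] + [(0.0, 36.0, -120.0)]
print(len(find_bursts(catalog)))
```

A production version would index the catalog spatially (e.g., a k-d tree on Cartesian coordinates including depth) and then classify each flagged sequence as aftershock-like or swarm-like using the diagnostics the abstract lists.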

Progress in Waveform Based Wide Area Event Relocation in Northern California

Waldhauser, Felix (LDEO) and David Schaff (LDEO)

We are in the process of relocating 225,000 seismic events in Northern CA using both bulletin data from the Northern California Seismic Network (NCSN) and a recently computed comprehensive data base of cross-correlation differential times for earthquakes between 1984-2003. The correlation data base includes a total of about 3 billion P- and S-wave differential times from pairs of waveforms with
cross correlation coefficients (CC) of 0.6 or greater, for events separated by less than 5 km. Approximately 90% of the seismicity includes events that have CC > 0.7 with at least one other event recorded at four or more stations. Large numbers of correlated events occur in different tectonic regions, including the San Andreas Fault, the Long Valley caldera, the Geysers geothermal field, and the Mendocino triple junction. We present double-difference relocations for about 80,000 earthquakes along the San Andreas Fault system, from north of San Francisco to Parkfield, covering a region that includes simple strike-slip faults (creeping and locked) and complex structures such as fault segmentation and step-overs. A much sharper picture of the seismicity emerges from the relocated data, indicating that the degree of resolution obtained in recent, small-scale relocation studies can be achieved for large areas and across complex tectonic regions. We are also comparing our results with those from recent relocation work by Hauksson and Shearer (H&S), who are also using cross-correlation and multi-event location techniques. Comparing our differential time measurements with those obtained by H&S for identical earthquake pairs recorded near Mendocino, CA, for example, we find that about 80% of the measurements agree within 10 ms (10 ms is also the sampling interval of the stations we compared). Both studies employ time-domain cross-correlation techniques, but use different interpolation functions, window lengths, and outlier detection methods.
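The basic measurement both groups make — correlate two waveforms in the time domain, pick the lag of the peak coefficient, and interpolate to sub-sample precision — can be sketched as follows. This is a minimal illustration with parabolic peak interpolation as one common choice; window selection, filtering, and outlier rejection are omitted, and neither group's actual interpolation function is implied:

```python
import math

def norm_xcorr(a, b, lag):
    """Normalized correlation of a against b shifted by `lag` samples."""
    pairs = [(a[i], b[i + lag]) for i in range(len(a))
             if 0 <= i + lag < len(b)]
    if not pairs:
        return 0.0
    sa = math.sqrt(sum(x * x for x, _ in pairs))
    sb = math.sqrt(sum(y * y for _, y in pairs))
    return sum(x * y for x, y in pairs) / (sa * sb) if sa and sb else 0.0

def cc_delay(a, b, dt, max_lag):
    """Peak correlation coefficient and differential time (seconds)
    between waveforms a and b sampled at interval dt."""
    cc = [norm_xcorr(a, b, k) for k in range(-max_lag, max_lag + 1)]
    k = max(range(len(cc)), key=cc.__getitem__)
    shift = float(k - max_lag)
    # Parabolic interpolation around the peak for sub-sample precision.
    if 0 < k < len(cc) - 1:
        y0, y1, y2 = cc[k - 1], cc[k], cc[k + 1]
        denom = y0 - 2 * y1 + y2
        if denom != 0:
            shift += 0.5 * (y0 - y2) / denom
    return cc[k], shift * dt
```

With a 10 ms sampling interval (dt = 0.01), the interpolation step is what allows differential times finer than one sample.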

The 2001 and 2005 Anza Earthquakes: Aftershock Focal Mechanisms

Walker, Kris (IGPP/SIO), Debi Kilb (IGPP/SIO), and Guoqing Lin (IGPP/SIO)

Two M ~5 earthquakes occurred generally within the Anza seismic gap along the San Jacinto Fault zone during the last 4 years (M 5.1, October 31, 2001; M 5.2, June 12, 2005). The 2005 event occurred ~9 km southeast of the town of Anza, and the 2001 event was ~6 km farther southeast. These events have significantly different focal mechanisms, and it is unclear whether they occurred on a northwest-striking fault parallel to the San Jacinto Fault or on a conjugate northeast-striking fault. Both events were followed by productive aftershock sequences (Felzer and Shearer, this meeting). Significant post-seismic creep was recorded by strain meters near Anza for several days following the mainshock (Agnew and Wyatt, this meeting). In light of these observations, several questions arise regarding the focal mechanisms and spatial/temporal behavior of the mainshocks and associated aftershocks: (1) how similar are the two sequences; (2) do the data define a well-delineated fault system consistent with surface observations; and (3) is there a spatial/temporal evolution or clustering of the aftershock focal mechanisms? To investigate these questions we calculate focal mechanisms using polarity information from the SCEC catalog, relocate aftershocks using a waveform cross-correlation technique, and explore the data using 3D visualizations (Kilb et al., this meeting). We use a clustering algorithm to identify similar focal mechanism types, and search for trends in the occurrences of these events as a function of space and time. The spatial distribution of the relocated aftershocks appears 'cloud-like', not aligning with a narrow fault core. Similarly, the aftershock focal mechanisms are heterogeneous: the 2001 and 2005 sequences comprise only 42% and 64% strike-slip events, respectively. These values drop to 25% and 46% when we consider only strike-slip mechanisms consistent with the strike of the San Jacinto Fault. In addition, there is a relatively large proportion of normal-faulting aftershocks in the 2001 sequence (18%) relative to the 2005 sequence (7%). These results suggest that both aftershock zones are highly fractured and heterogeneous volumes.
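One simple way to tabulate mechanism types, such as the strike-slip percentages quoted above, is to bin focal mechanisms by rake angle. The 30-degree thresholds below are an illustrative convention, not the clustering algorithm actually used in the study:

```python
def classify_rake(rake_deg):
    """Crude mechanism classification from rake angle (Aki & Richards
    convention): near 0 or +/-180 -> strike-slip, near -90 -> normal,
    near +90 -> reverse.  Thresholds are illustrative choices."""
    r = (rake_deg + 180.0) % 360.0 - 180.0   # wrap into (-180, 180]
    if abs(r) <= 30.0 or abs(r) >= 150.0:
        return "strike-slip"
    return "normal" if r < 0 else "reverse"

def mechanism_fractions(rakes):
    """Fraction of events in each mechanism class."""
    counts = {}
    for rake in rakes:
        key = classify_rake(rake)
        counts[key] = counts.get(key, 0) + 1
    n = len(rakes)
    return {key: v / n for key, v in counts.items()}
```

A production workflow would classify on the full (strike, dip, rake) of both nodal planes rather than rake alone.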

GPS Installation Progress in the Southern California Region of the Plate Boundary Observatory

Walls, Chris, Ed Arnitz, Scott Bick, Shawn Lawrence, Karl Feaux, and Mike Jackson (UNAVCO-PBO)

One of the roles of the Plate Boundary Observatory (PBO), part of the larger NSF-funded EarthScope project, is the rapid deployment of permanent GPS units following large earthquakes to capture postseismic transients and the longer-term viscoelastic response to an earthquake. Beginning the day of the September 28th, 2004, Parkfield earthquake, the PBO Transform Site Selection Working Group elevated the priority of two pre-planned GPS stations (P539 and P532) that lie to the south of the earthquake epicenter, allowing reconnaissance and installation procedures to begin ahead of schedule. Reconnaissance for five sites by both the Southern and Northern California offices began the day following the earthquake, and two permits were secured within three days of the earthquake. Materials and equipment for construction were brought along with the response team, and within 4 days the first monument (P539) was installed.

Of the 875 total PBO GPS stations, 212 proposed sites are distributed throughout the Southern California region. These stations will be installed over the next 3 years in priority areas recommended by the PBO Transform, Extension and Magmatic working groups. Volunteers from the California Spatial Reference Center and others within the survey community have aided in the siting and permitting process. Currently the production status is: 59 stations built (23 short braced monuments, 36 deep drilled braced monuments), 72 permits signed, 105 permits submitted and 114 station reconnaissance reports. To date, Year 1 and 2 production goals are on schedule and under budget.

Combined and Validated GPS Data Products for the Western US

Webb, Frank (JPL), Yehuda Bock (UC San Diego, Scripps Institution of Oceanography), Danan Dong (JPL), Brian Newport (JPL), Paul Jamason (SIO), Michael Scharber (SIO), Sharon Kedar (JPL), Susan Owen (JPL), Linette Prawirodirjo (SIO), Peng Fang (SIO), Ruey-Juin Chang (SIO), George Wadsworth (SIO), Nancy King (USGS), Keith Stark (USGS), Robert Granat (JPL), and Donald Argus (JPL)

The purpose of this project is to produce and deliver high quality GPS time series and higher-level data products derived from multiple GPS networks along the western US plate boundary, and to use modern IT methodology to make these products easily accessible to the community.

This multi-year NASA-funded project, "GPS DATA PRODUCTS FOR SOLID EARTH SCIENCE" (GDPSES), has completed the product development phase and automation of Level-1 data products. The project processes and posts a daily solution generated by combining two independent GPS station position solutions, generated at SIO and JPL using GAMIT and GIPSY, respectively. A combination algorithm, 'st_filter' (formerly known as QOCA), has been implemented. A combined ~10-year-long time series for over 450 western US GPS sites is available for viewing and download by the scientific community via the project's web portal at http://reason.scign.org. In addition to ongoing product generation, GDPSES has the capability to reprocess and combine over a decade of data from the entire western US within a few days, which enables a quick update of the Level-1 products and their derivatives when new models are tested and implemented.
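At its simplest, combining two independent position solutions reduces to an inverse-variance weighted mean per epoch. The sketch below is a textbook illustration of that idea only — st_filter additionally estimates common-mode, reference-frame, and correlated-noise effects, which this does not attempt:

```python
def combine_solutions(x1, s1, x2, s2):
    """Inverse-variance weighted combination of two independent position
    estimates (e.g. an SIO/GAMIT and a JPL/GIPSY daily solution) with
    1-sigma uncertainties s1 and s2.  Textbook sketch, not st_filter."""
    w1, w2 = 1.0 / s1 ** 2, 1.0 / s2 ** 2
    x = (w1 * x1 + w2 * x2) / (w1 + w2)
    s = (w1 + w2) ** -0.5
    return x, s
```

Equal uncertainties give the simple average; a much noisier solution is effectively down-weighted out of the combination.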

To achieve the project goals and support current data products, several ongoing IT developments are taking place. In the forefront is an Adaptive Seamless Archive System, which uses web services for GPS data discovery, exchange and storage. GDPSES has unified the station data and metadata inputs into the processing procedures at the independent analysis centers. The project has developed
XML schemas for GPS time series, and is developing and implementing an array of data quality tools, to ensure a high-quality combined solution, and to detect anomalies in the time series. Event leveraging will alert users to tectonic, anthropogenic and processing 'events'. In the next few months the project, through its new data portal called GPS Explorer, will enable users to zoom in and access subsets of the data via web services. As part of its IT effort the project is participating in NASA's Earth Science Data Systems Working Groups (ESDSWG), and contributing to the Standards working group.

Slip Rate of the San Andreas Fault near Littlerock, California

Weldon, Ray (U Oregon) and Tom Fumal (USGS)

Two offsets, 18+/-2 and ~130 m, across the Mojave portion of the San Andreas fault yield slip rates of ~36 mm/yr (uncertainties associated with the two groups of offsets are discussed separately below). These data are consistent with the slip rate inferred at Pallett Creek [8.5 km to the SE (Salyards et al., 1992)], the local long-term rate [averaged over ~2 Ma (Weldon et al., 1993) and over ~400 ka (Matmon et al., 2005)], and kinematic modeling of the San Andreas system (Humphreys and Weldon, 1994). These results, combined with the earlier work, suggest that the rate has been constant at the resolution of geologic offsets, despite the observation that the decadal geodetic rate is interpreted to be 5-15 mm/yr lower.

Two small streams and a terrace riser are each offset 18+/-2 m by the two active traces of the San Andreas fault at the site. Evidence from trenches at one of the 18 m offsets is interpreted to show that the offset was caused by 3 earthquakes, the first of which closed a small depression into which pond sediments were subsequently deposited. The youngest C-14 sample below the pond is dated at 372+/-31 C-14 yr BP (dates are reported in C-14 years, but slip rates are calculated using calibrated years), and the oldest consistent sample in the pond sediments is 292+/-35 BP. These dates are consistent with the 3rd earthquake back (Event V) at the nearby Pallett Creek paleoseismic site. If one makes simplifying assumptions, including a time- or slip-predictable model to relate dated offsets to slip rate, and uses the better-constrained ages of the paleoearthquakes at Pallett Creek, one can calculate a slip rate of 36 +/- 5 mm/yr. A more conservative interpretation, using the variability in recurrence intervals and offsets seen on this part of the fault to allow for the possibility that the recent 3 events are just one realization of a range of possible 3-event sequences, yields a slip rate of 36 +24/-16 mm/yr.
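The shape of a rate-with-uncertainty estimate like 36 +/- 5 mm/yr can be illustrated by Monte Carlo propagation of the offset and age uncertainties. This is a generic sketch under assumed Gaussian errors and a nominal ~500-yr age for the 18 m offset; the paper's actual bounds come from event-sequence models, not from this calculation:

```python
import random

random.seed(0)  # reproducible draws for this illustration

def slip_rate_mc(offset_m, offset_sd, age_yr, age_sd, n=100_000):
    """Monte Carlo propagation of offset and age uncertainties into a
    slip rate (mm/yr).  Gaussian errors and the input values are
    illustrative assumptions.  Returns (2.5%, median, 97.5%) rates."""
    rates = []
    for _ in range(n):
        off = random.gauss(offset_m, offset_sd)
        age = random.gauss(age_yr, age_sd)
        if age > 0:
            rates.append(1000.0 * off / age)   # m/yr -> mm/yr
    rates.sort()
    return (rates[int(0.025 * len(rates))],
            rates[len(rates) // 2],
            rates[int(0.975 * len(rates))])

lo, mid, hi = slip_rate_mc(18.0, 2.0, 500.0, 50.0)
```

With these inputs the median lands near 18 m / 500 yr = 36 mm/yr, with a spread set by the two relative uncertainties.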

A 3520 +/-220 BP channel deposit offset by 130 +/-70 m may also yield a slip rate of ~36 mm/yr. It is difficult to assess the uncertainty associated with this best estimate because the range includes 3 different interpretations (~200, ~130, and ~65 m) that are mutually exclusive. Our preferred interpretation requires that the canyon on the NE side of the fault captured the broad valley to the SW when the two lows in the fault parallel ridges were first juxtaposed by RL slip, not when the largest drainages on each side were aligned. Following at least 50 m of RL offset and 15-20 m of incision, alluviation deposited the 3520 BP channel and subsequent incision set the major drainage on the SW side across that on the NE side, isolating and preserving the dated deposit. Efforts are underway to better constrain the geometry of the ~130 m offset, and to determine the offset of 900 to 1200 BP deposits to provide an intermediate slip rate estimate.

Revision of Time-Independent Probabilistic Seismic Hazard Maps for Alaska

Wesson, Rob, Oliver Boyd, Chuck Bufe, Chuck Mueller, Art Frankel, and Mark Petersen (USGS Golden)

We are currently revising the probabilistic seismic hazard maps of Alaska and the Aleutians. Although analysis of the seismic hazard in Alaska differs from that in Southern California in that a subduction zone is
present, both Alaska and Southern California face large seismic hazard from crustal strike-slip and thrust faults. In addition to preparing time-independent hazard maps, we are also preparing experimental maps including time-dependent earthquake probability estimates. Modifications to the previous version of the time-independent maps, following workshops held in Alaska, include

1) splitting the 1964 zone into a 1964 segment and a Kodiak Island segment, to account for the evidence that the Kodiak Island segment appears to have a recurrence interval one-half that of a 1964-type event;
2) adding a segment southwest of Kodiak Island that accounts for geodetic evidence suggesting that the overriding plate is not coupled to the subducting plate;
3) reducing the depth to the subduction zone beneath Anchorage;
4) accounting for recent work that suggests that the slip rate along the Castle Mountain fault may be as high as 2 mm/yr, as opposed to the value of 0.5 mm/yr used in 1999;
5) reducing the slip rate along the Totschunda fault to 6 mm/yr (previously 11.5 mm/yr) and increasing the slip rate along the Eastern Denali to near 7 mm/yr (previously 2 mm/yr);
6) including a subduction zone segment at the far western end of the Aleutian arc.

We have also modified the hazard programs to allow for position-dependent probabilities. This has allowed us to vary the probability for large earthquakes along the eastern Denali, from a value that reflects 7 mm/yr of right-lateral slip where it meets the central Denali to one that reflects 2 mm/yr at its southern end. This modification has also allowed us to cascade earthquakes between the central and eastern Denali and between the central Denali and Totschunda faults.
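Grading a fault parameter along strike, as described for the eastern Denali (about 7 mm/yr at the central Denali junction tapering to 2 mm/yr at the southern end), can be sketched as a simple taper. The linear form and the fault length used here are assumptions for illustration; the hazard code's actual position-dependent scheme may differ:

```python
def slip_rate_along_fault(s_km, length_km, rate_start, rate_end):
    """Linearly taper slip rate (mm/yr) with along-strike position s_km,
    from rate_start at s_km = 0 to rate_end at s_km = length_km, clamped
    beyond the fault ends.  Linear interpolation is an assumption."""
    f = min(max(s_km / length_km, 0.0), 1.0)
    return rate_start + f * (rate_end - rate_start)
```

The probability of a given-magnitude rupture at each point can then be scaled from the local rate rather than a single fault-wide value.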

Development Roadmap for the PyLith/LithoMop/EqSim Finite Element Code for Fault-Related Modeling in Southern California

Williams, Charles (RPI), Brad Aagaard (USGS-Menlo Park), and Matt Knepley (ANL/CIG)

The Fault Systems Crustal Deformation Working Group plans to produce high-resolution models of coseismic and interseismic deformation in southern California using the Community Block Model as a basis. As part of this effort we are developing a finite element code capable of modeling both quasi-static and dynamic behavior in the solid earth. The quasi-static code, presently known as LithoMop, has evolved from our previous version of the TECTON finite element code. We plan to combine this code with the EqSim dynamic rupture propagation code to provide a new package known as PyLith. This combined package will be able to simulate crustal behavior over a wide range of spatial and temporal scales. For example, it will be possible to simulate stress evolution over numerous earthquake cycles (a quasi-static problem) as well as the rapid stress changes occurring during each earthquake in the series (a dynamic problem).

We describe here the current development status of the PyLith components, and provide a roadmap for code development. The PyLith package will make use of the Pyre simulation framework, and will use PETSc for code parallelization. The package will also make use of a powerful and flexible new method of representing computational meshes, Sieve, presently being developed as a part of PETSc. Sieve will greatly simplify the task of parallelizing the code, and will make it much easier to generalize the code to different dimensions and element types. A version of LithoMop using the Pyre framework is presently available. It uses PETSc serial solvers for solution of the linear system of equations, and we plan to have a fully parallel version of the code available within the next month or two. A version of EqSim that uses the Pyre framework, PETSc, and Sieve is currently under development with a planned release in the Spring of 2006. An initial version of PyLith is planned for the Summer of 2006.

Southernmost San Andreas Fault Rupture History: Investigations at Salt Creek

Williams, Patrick (Williams Assoc.) and Gordon Seitz (SDSU)

The earthquake history of the southernmost San Andreas fault (SSAF) has implications for the timing and magnitude of future ruptures of the southern portion of the fault, and for fundamental properties of the transform boundary. The SSAF terminates against the “weak” transtensional Brawley seismic zone, and conjunction of the fault against this ~35-km-wide extensional step isolates the SSAF from transform faults to the south. SSAF ruptures are therefore likely to be relatively independent indicators of elastic loading rate and local fault properties. Knowledge of whether SSAF ruptures are independent of, or participate in ruptures of the bordering San Bernardino Mountain segment of the San Andreas fault is essential for full modeling of the southern San Andreas.

To recover long-term slip parameters and rupture history for the SSAF, geological evidence of its past motion has been investigated at Salt Creek, California, about 15 km from the SSAF's transition to the Brawley seismic zone. Sediments dated at AD1540±100 are offset 6.75±0.7m across the SSAF at Salt Creek. Beds with an age of AD1675±35 are offset 3.15±0.1m. Williams (1989) and Sieh and Williams (1990) showed that near Salt Creek, ~1.15m of dextral slip accumulated aseismically over the 315-year AD1703-1987 period, yielding a creep rate of 4±0.7 mm/yr. If similar creep behavior held through the shorter AD1540-1675 interval (135±105yr), net seismic surface displacement at Salt Creek was ~2m in the latest event, and ~3m in the prior event. Slip rate in the single closed interval is not well constrained due to its large radiocarbon calibration uncertainty.
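The back-of-envelope behind the ~2 m figure is: subtract the slip accumulated by steady creep from the measured offset. The 4 mm/yr rate and the offset come from the text; applying the modern creep rate throughout the interval is the paper's stated assumption, and the ~312-yr interval length (AD1675 to 1987) is inferred here for illustration:

```python
def coseismic_slip(total_offset_m, creep_rate_mm_yr, interval_yr):
    """Net seismic surface displacement: total measured offset minus the
    slip accumulated by steady creep over the interval.  Assumes the
    modern creep rate held throughout, per the text."""
    return total_offset_m - creep_rate_mm_yr * 1e-3 * interval_yr
```

For the AD1675 beds, 3.15 m minus 4 mm/yr over ~312 yr leaves ~1.9 m of coseismic slip, i.e. the ~2 m quoted for the latest event.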

The hiatus between ultimate and penultimate ruptures was at least 100 years shorter than the modern quiescent period of 335±35 years. This indicates a very high contemporary rupture hazard, and given the long waiting time, suggests that the fault’s next rupture will produce a significantly larger displacement than the two prior events.

Paleoseismic and neotectonic studies of the Salton Trough benefit from repeated flooding of the Trough’s closed topographic basin by the Colorado River. Ancient “Lake Cahuilla” reached an elevation of 13m, the spillpoint of the basin, at least five times during the past ~1200 years (Waters, 1983; Sieh, 1986; data of K. Sieh in Williams, 1989). Flood heights were controlled by stability of the Colorado River delta during this interval.

Ongoing studies show excellent promise for recovery of the relationship between lake chronology and the San Andreas earthquake record. We have recovered sediment evidence at a new Salt Creek site (Salt Creek South) of five flood cycles in the modern 1200 year period. The lake record contains conditional evidence of six paleoearthquakes, potentially a more complete record than that developed for the adjoining Mission Creek branch of the SSAF by Fumal, Rymer and Seitz (2002). Continuing and planned work includes (i) high resolution age dating of lake and interlake record, (ii) establishment of more robust field evidence for all the interpreted events, and (iii) recovery of a slip-per-event history for the latest 3-4 events.

Loss Estimates for San Diego County due to an Earthquake along the Rose Canyon Fault

Wimmer, Loren (SCEC/SURE Intern, SDSU), Ned Field (USGS), Robert J. Mellors (SDSU), and Hope Seligson (ABS Consulting Inc.)

A study was done to examine possible losses to San Diego County should a full-fault earthquake rupture occur along the Rose Canyon fault, which runs directly through portions of San Diego and is expressed by Mt. Soledad and San Diego Bay. The total length of the fault is ~70 km (including the Silver Strand fault). Following the 2002 National Seismic Hazard Mapping Program, we consider
a full fault rupture to be between magnitude 6.7 and 7.5, with the most likely magnitude being 7.0. Using this range of magnitudes, sampled every 0.1 units, and six different attenuation relationships, 54 different shaking scenarios were computed using OpenSHA (www.OpenSHA.org). Loss estimates were made by importing each scenario into the FEMA program HAZUS-MH MR1. The total economic loss is estimated to be between $7.4 and $35 billion. The analysis also provides the following estimates: 109 - 2,514 fatalities, 8,067 - 76,908 displaced households, 2,157 - 20,395 people in need of short-term public shelter, and 2 - 13 million tons of debris generated. As in a previous study of the effects of a Puente Hills earthquake in Los Angeles, this study shows that the choice of attenuation relationship has a greater effect on predicted ground motion than the choice of magnitude, and is thus the larger source of uncertainty in the loss estimates. A full fault rupture along the Rose Canyon fault zone would be a rare event, but due to the proximity of the fault to the City of San Diego, the possibility is worth consideration for mitigation efforts.
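The 54 scenarios are simply the cross product of the nine magnitude samples and the six attenuation relationships. The relationship names below are placeholders, since the study's specific six are not listed here:

```python
# Magnitudes sampled every 0.1 units from 6.7 to 7.5 (nine values),
# crossed with six attenuation relationships (placeholder names).
magnitudes = [round(6.7 + 0.1 * i, 1) for i in range(9)]   # 6.7 .. 7.5
attenuations = ["rel_%d" % i for i in range(1, 7)]         # six models
scenarios = [(m, rel) for m in magnitudes for rel in attenuations]
```

Each (magnitude, relationship) pair defines one shaking map, which is then fed to the loss model; the spread of the 54 resulting losses is the quoted uncertainty range.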

Extending the Virtual Seismologist to Finite Ruptures: An Example from the Chi-Chi Earthquake

Yamada, Masumi (Caltech) and Thomas Heaton (Caltech)

Earthquake early warning systems collect seismic data from an occurring event, analyze them quickly, and provide estimates of the location and magnitude of the event. Recently, owing to advances in data analysis and an increased public awareness of seismic hazards, the topic of early warning has attracted more research attention from seismologists and engineers. Cua and Heaton (2004) developed the Virtual Seismologist (VS) method, a Bayesian approach to seismic early warning designed for modern seismic networks. The VS algorithm uses envelope attenuation relationships and the predominant frequency content of the initial 3 seconds of the P-wave at a station, and gives the best estimate of the earthquake parameters in the form of a probability function.

We extend the VS method to large earthquakes for which fault finiteness is important. The general VS method was developed for small earthquakes whose rupture can be modeled as a point source; it does not account for radiation pattern, directivity, or fault finiteness. Currently, the VS method uses the acceleration, velocity, and 3-second high-pass-filtered displacement for data analysis. For larger earthquakes, however, we need to consider the finite rupture area and lower-frequency ground motion.

We introduce a multiple source model to express the fault finiteness. A fault surface is divided into subfaults, and each subfault is represented by a single point source. The ground motion at a site is expressed by a combination of the responses corresponding to each point source. This idea was developed by Kikuchi and Kanamori (1982), who deconvolved complex body waves into multiple shocks. We find that the square root of the sum of the squares of the envelope amplitudes of the multiple point sources provides a good estimate of the acceleration envelope. Low-frequency motions are important for understanding the size of the slip along the fault. Since peak ground acceleration (PGA) tends to saturate with respect to magnitude, the displacement records are more useful for obtaining information on slip, which controls the ultimate magnitude of the event. A methodology to estimate the size of the slip from the displacement envelopes is being developed by Yamada and Heaton. Records at stations near the fault surface include information on the size of the slip, so we first classify records as near-field or far-field using linear discriminant analysis (LDA). In general, LDA requires placing observations in predefined groups and finding a function for discriminating between the groups; we use this function to classify future observations into the predefined groups. We find that the higher-frequency components (e.g., acceleration) correlate strongly with distance from the fault, and that LDA using PGA and the peak of the derivative of the acceleration classifies the data with 85% accuracy.
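The root-sum-of-squares combination of subfault envelopes reads directly as code. This sketch assumes the point-source envelopes are already sampled on a common time axis; travel-time alignment and attenuation are handled upstream by the VS envelope relations and are omitted here:

```python
import math

def combined_envelope(subfault_envelopes):
    """Combine the acceleration envelopes of the point sources
    representing each subfault as the square root of the sum of squares
    at each time sample -- the approximation described above."""
    n = len(subfault_envelopes[0])
    return [math.sqrt(sum(env[t] ** 2 for env in subfault_envelopes))
            for t in range(n)]
```

Two coincident sub-envelopes of amplitudes 3 and 4 thus combine to 5, not 7, reflecting incoherent summation of the high-frequency motion.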

Regional Mapping of Crustal Structure in Southern California Using Receiver Functions

Yan, Zhimei and Robert W. Clayton (Caltech)

Lateral variations of crustal structure in Southern California are determined from receiver function studies using data from the broadband stations of the Southern California Seismic Network (SCSN) and the LARSE surveys. The results include crustal thickness estimates at the stations themselves and, where possible, cross-sections. Large, rapid variations in crustal structure are observed beneath the San Gabriel Mountains, and a root where the Moho ramps in depth from the neighboring 30-33 km to 36-39 km is imaged beneath the central part of the San Gabriel Mountains. A negative impedance, similar in depth to the bright spot imaged by Ryberg and Fuis, is also commonly, but not consistently, observed at the San Gabriel Mountain stations. A relatively flat Moho at about 28-30 km depth is observed in the western Mojave Desert, but a shallower Moho of about 23-27 km is observed in the eastern Mojave Desert. A sudden Moho depth jump of about 8 km occurs beneath the Fenner Valley, east of Amboy, CA (station DAN), over a lateral distance of no more than 12 km. Unusual receiver functions, including Pms arrivals, are observed for some stations in the trans-tensional zones of Dokka's kinematic model, such as the station near Barstow (RRX). This indicates that openings between different blocks, and thus rotation of the blocks in this type of model, might extend to the Moho. A negative impedance directly beneath the Moho, corresponding to a low-velocity zone, is observed at several eastern Mojave Desert stations, which could be related to the weak upper mantle lithosphere in that area. Asymmetric extension of the Salton Sea is observed: gradual thinning to the west and a sharp transition to the east. Crustal thickening and local variations are also observed under the central Sierra Nevada and the nearby central Basin and Range.

Analysis of Earthquake Source Spectra from Similar Events in the Aftershock Sequences of the 1999 M7.4 Izmit and M7.1 Duzce Earthquakes

Yang, Wenzheng (USC), Zhigang Peng (UCLA), and Yehuda Ben-Zion (USC)

We use an iterative stacking method (Prieto et al., 2004) to study the relative source spectra of similar earthquakes in the aftershock sequences of the 1999 Mw7.4 Izmit and Mw7.1 Duzce earthquakes. The initial study used a tight cluster of 160 events along the Karadere segment recorded by 10 short-period stations. We compute the P-wave spectra using a multitaper technique, and iteratively separate the stacked source and receiver-path spectral terms from the observed spectra. The relative log potency computed from the low-frequency amplitude of the source spectral term scales with the local magnitude with a slope close to 1. An empirical Green's function (EGF) is used to correct for attenuation and derive the relative spectral shapes of events in different potency/moment bins. The receiver-path spectral terms for stations inside or close to the fault zone are larger than those of other stations. This may be related to the fault-zone trapping structure and related site effects in the area discussed by Ben-Zion et al. (2003). Continuing work will focus on estimating the corner frequency, stress drop, and radiated seismic energy from the relative source spectra. The same method will be applied to other similar-earthquake clusters identified by Peng and Ben-Zion (2005), and to larger events (M3-5) recorded by strong ground motion instruments. Updated results will be presented at the meeting.
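The iterative separation can be illustrated in the log-spectral domain, where (at each frequency) the observed spectrum factors into additive source and receiver-path terms. This toy version works on a single frequency bin and resolves the inherent additive trade-off by zero-meaning the receiver terms — an illustrative convention; the actual method of Prieto et al. (2004) differs in its details:

```python
def separate_terms(logspec, n_iter=20):
    """Iteratively separate stacked source and receiver-path terms from
    observed log spectra, assuming logspec[i][j] = s[i] + r[j] for event
    i recorded at station j.  Minimal single-frequency sketch."""
    ne, ns = len(logspec), len(logspec[0])
    s = [0.0] * ne
    r = [0.0] * ns
    for _ in range(n_iter):
        for i in range(ne):
            s[i] = sum(logspec[i][j] - r[j] for j in range(ns)) / ns
        for j in range(ns):
            r[j] = sum(logspec[i][j] - s[i] for i in range(ne)) / ne
    # Resolve the additive trade-off: force the mean receiver term to 0.
    mu = sum(r) / ns
    return [x + mu for x in s], [x - mu for x in r]
```

Only relative source levels and relative receiver-path levels are recoverable, which is why the abstract speaks of relative log potency and relative spectral shapes.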

Significance of Focal Depth in Primary Surface Rupture Accompanying Large Reverse-fault Earthquakes

Yeats, Robert S. (Oregon State University), Manuel Berberian (Najarian Associates), and Xu Xiwei (China Earthquake Administration)

We are creating a worldwide database of historical reverse-fault earthquakes (see table in poster) because empirical relations between surface rupture and magnitude are less clear for earthquakes on reverse faults than for earthquakes on normal or strike-slip faults. Only a few historical earthquakes (Zenkoji, Rikuu, Tabas-e-Golshan, Bo'in Zahra, Chi-Chi, El Asnam, San Fernando, Susitna Glacier, Latur, Ungava, and several in Australia) are fully expressed by surface rupture, mostly (and probably entirely) due to shallow focal depth. The Meckering, Australia, earthquake nucleated at a depth of only 3 km. Some earthquakes (Suusamyr, Hongyazi, Gulang, Manas, Bhuj, Inangahua, Spitak, and Chengkung) are only partially expressed as surface rupture, and still others (Coalinga, Northridge, Gazli, Racha, Limón, Niigata-ken Chuetsu, Sirch, Kangra, and Nepal-Bihar) are blind-thrust earthquakes. Suusamyr, Bhuj, Loma Prieta, Northridge, and Inangahua nucleated near the brittle-ductile transition. Coalinga, Racha, and Kangra ruptured at shallow depths on low-angle thrusts that did not reach the surface.

Many reverse-fault earthquakes, probably including all with extensive surface rupture, nucleated within rather than at the base of seismogenic crust. The shallow depth of these earthquakes may be because the maximum compressive stress is horizontal, and the overburden stress is the minimum compressive stress. Toward the surface at shallow depths, overburden stress should decrease more rapidly than horizontal stress, resulting in an increase in the shear stress as the surface is approached. Historical earthquakes at the Himalayan front with no surface rupture are shown by paleoseismic trenching to be preceded by earthquakes with extensive surface displacement that were not documented in historical records. Rather than being larger than the historical earthquakes, the earlier surface-rupturing earthquakes, if they ruptured rocks of lower rigidity near the surface, might have been of lower magnitude with less strong ground motion. They might also have been slow earthquakes and in part aseismic. This is illustrated by two thrust-fault events on the Gowk fault zone in central Iran, the 1981 Sirch earthquake of Mw 7.1 with focal depth 17-18 km and the 1998 Fandoqa earthquake of Mw 6.6 with a focal depth of 5 km. The larger earthquake had limited surface rupture on the Gowk strike-slip fault in its hanging wall; the smaller Fandoqa earthquake had much more extensive surface rupture on the Gowk fault and aseismic slip on the Shahdad thrust, the near-surface equivalent of the Sirch source fault.

Trench excavations document surface-rupturing earthquakes but only rarely earthquakes nucleating near the base of the seismogenic zone. For this reason, recurrence intervals of reverse-fault earthquakes based on trenching are maxima. Paleoseismic evidence for deeper crustal earthquakes is difficult to find, although some progress has been made.

Sliding Resistance of Rocks and Analog Materials at Seismic Slip Rates and Higher

Yuan, Fuping (CWRU) and Vikas Prakash (CWRU)

Determining the shear resistance on faults during earthquakes is a high-priority concern for researchers in fault and rock mechanics. Knowledge of shear resistance, and of how it depends on slip velocity, slip distance, normal stress, etc., is fundamental to understanding earthquake source physics. In the present poster we present results from two relatively new experimental techniques recently developed at CWRU to investigate high-speed friction in analog materials and rocks: (a) the plate impact pressure-shear friction experiment, and (b) the modified torsional Kolsky bar friction experiment. The plate impact experiments were employed to study a variety of friction states with normal stress varying from 0.5 to 1 GPa and slip speeds ranging from 1 to 25 m/s. The torsional Kolsky bar experiments were employed to study interfacial friction at normal stresses ranging from 20 to 100 MPa and slip velocities of up to 5 m/s. Plate impact pressure-shear friction experiments were conducted on soda-lime glass and fine-grained novaculite rock, while the modified torsional Kolsky bar was used to conduct experiments on quartz and glass specimens. The results provide a time-resolved history of the interfacial tractions, i.e. the friction stress and the normal stress, and an estimate of the interfacial slip velocity and temperature. The glass-on-glass pressure-shear plate impact experiments show that a wide range of friction coefficients (from 0.2 to 1.3) can result during high-speed slip; the interface shows no slip initially, followed by slip weakening, strengthening, and then seizure. For the novaculite rock, the initial no-slip and final seizure conditions are absent. Moreover, the torsional Kolsky bar results indicate that despite high friction coefficients (~0.5-0.75) during slow frictional slip, strong rate-weakening of localized frictional slip at seismic slip rates can occur, leading to friction coefficients in the range of 0.1 to 0.3, at least before macroscopic melting of the interface occurs.

Evaluating the Rate of Seismic Moment Release: A Curse of Heavy Tails

Zaliapin, Ilya (UCLA), Yan Kagan (UCLA), and Rick Schoenberg (UCLA)

Applied statistical data analysis is commonly guided by the intuition of researchers trained to think in terms of "averages", "means", and "standard deviations". Curiously, this mindset can be misleading for an essential class of relevant natural processes.

Seismology presents a superb example of such a situation: one of its fundamental laws, describing the distribution of seismic moment release, has (in its simplest form) both infinite mean and infinite standard deviation, which formally makes notions like "moment rate" ill-defined. Various natural tapers (truncations) have been suggested to "tame" the moment distribution, but dramatic discrepancies between observed seismic moment release and its geodetic predictions are still reported.

We show how the reported discrepancies between observed and predicted seismic moment release can be explained by the heavy-tailed part of the moment distribution. We also discuss some statistical paradoxes of heavy-tailed sums that affect approximations of the cumulative seismic moment release. Several analytical and numerical approaches to the problem are presented. We describe several very precise methods for approximating the distribution of the cumulative moment release from an arbitrary number of earthquakes and illustrate these methods using California seismicity.
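As an illustration of the underlying statistical point (not the authors' method), a pure Pareto moment distribution with index β ≈ 2/3, as implied by the Gutenberg-Richter law, has an infinite mean, so sample averages of simulated moments fail to stabilize as the catalog grows:

```python
import random

random.seed(42)

BETA = 0.66    # moment-distribution index ~ 2/3 (from Gutenberg-Richter b ~ 1)
M_MIN = 1.0    # lower moment cutoff, arbitrary units

def pareto_moment():
    """Sample a pure (untapered) Pareto moment: P(M > m) = (M_MIN / m) ** BETA."""
    u = 1.0 - random.random()          # uniform on (0, 1], avoids u == 0
    return M_MIN * u ** (-1.0 / BETA)

def mean_of(n):
    """Sample mean of n simulated moments."""
    return sum(pareto_moment() for _ in range(n)) / n

# With BETA < 1 the theoretical mean is infinite: sample means keep growing
# with catalog size instead of converging, so a naive "moment rate" estimate
# depends on how long you observe.
for n in (10**2, 10**4, 10**5):
    print(n, mean_of(n))
```

A taper (e.g. an exponential corner-moment rolloff) restores a finite mean, which is why tapered forms are used in practice.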

An Earthquake Source Ontology for Seismic Hazard Analysis and Ground Motion Simulation

Zechar, Jeremy D. (USC), Thomas H. Jordan (USC), Yolanda Gil (USC/ISI), and Varun Ratnakar (USC/ISI)

Representation of the earthquake source is an important element in seismic hazard analysis and earthquake simulations. Source models span a range of conceptual complexity, from simple time-independent point sources to extended fault slip distributions. Further computational complexity arises because the seismological community has established many source description formats and variations thereof; as a result, conceptually equivalent source models are often expressed in different ways. Despite the resulting practical difficulties, there exists a rich semantic vocabulary for working with earthquake sources. For these reasons, we feel it is appropriate to create a semantic model of earthquake sources using an ontology, a knowledge-representation tool from computer science.

Unlike the domains of most ontology work to date, earthquake sources can be described within a very precise mathematical framework. Another distinctive aspect of developing such an ontology is that earthquake sources are often used as computational objects. A seismologist generally wants more than a source that is well-formed and properly described; the source will also be used to perform calculations. Representation and manipulation of complex mathematical objects presents a challenge to the ontology development community.

In order to enable simulations involving many different types of source models, we have completed preliminary development of a seismic point source ontology. Using an ontology to represent knowledge provides machine interpretability and the ability to validate logical consistency and completeness. Our ontology, encoded using the OWL Web Ontology Language (a standard from the World Wide Web Consortium), contains the conceptual definitions and relationships necessary for source translation services. For example, specification of strike, dip, rake, and seismic moment will automatically translate into a double-couple seismic moment tensor.
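The strike/dip/rake translation mentioned above follows the standard double-couple formulas (Aki & Richards convention: x = north, y = east, z = down). The sketch below is an illustrative implementation of those formulas, not code from the SST itself:

```python
import math

def double_couple_tensor(strike, dip, rake, m0):
    """Double-couple moment tensor from strike, dip, rake (degrees) and
    scalar moment m0, in the Aki & Richards x = north, y = east, z = down
    convention. Returns a symmetric 3x3 tensor as nested lists."""
    phi, delta, lam = (math.radians(a) for a in (strike, dip, rake))
    sd, cd = math.sin(delta), math.cos(delta)
    s2d, c2d = math.sin(2 * delta), math.cos(2 * delta)
    sl, cl = math.sin(lam), math.cos(lam)
    sp, cp = math.sin(phi), math.cos(phi)
    s2p, c2p = math.sin(2 * phi), math.cos(2 * phi)

    mxx = -m0 * (sd * cl * s2p + s2d * sl * sp * sp)
    mxy = m0 * (sd * cl * c2p + 0.5 * s2d * sl * s2p)
    mxz = -m0 * (cd * cl * cp + c2d * sl * sp)
    myy = m0 * (sd * cl * s2p - s2d * sl * cp * cp)
    myz = -m0 * (cd * cl * sp - c2d * sl * cp)
    mzz = m0 * s2d * sl
    return [[mxx, mxy, mxz], [mxy, myy, myz], [mxz, myz, mzz]]

# A pure double couple is traceless, and the scalar moment is recovered
# as sqrt(sum(Mij^2) / 2).
m = double_couple_tensor(strike=30.0, dip=60.0, rake=90.0, m0=1e17)
trace = m[0][0] + m[1][1] + m[2][2]
```

A translation service built on the ontology would perform exactly this kind of mapping between conceptually equivalent representations.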

The seismic point source ontology captures a body of domain-specific knowledge and can thus serve as the foundation for software tools that manipulate seismic sources. We demonstrate this usage through software called Seismic Source Translator (SST), a Java Application Programming Interface (API) accessed via an interactive Graphical User Interface (GUI). This application provides the means to construct a point source representation and translate it into a number of formats compatible with wave propagation modeling codes.

Fully 3D Waveform Tomography for the L. A. Basin Area Using SCEC/CME

Zhao, Li (USC), Po Chen (USC), and Thomas H. Jordan (USC)

A central problem of seismology is the inversion of regional waveform data for models of 3D earth structure. In Southern California, two 3D earth models, the SCEC Community Velocity Model (CVM) of Magistrale et al. (2000) and the Harvard/Caltech model (Komatitsch et al., 2003), are already available, and efficient numerical methods have been developed for solving the forward wave-propagation problem in 3D models. Building on these achievements, we have developed a unified inversion procedure to improve 3D earth models as well as recover the finite source properties of local earthquakes (Chen et al., this meeting). Our data are time- and frequency-localized measurements of the phase and amplitude anomalies relative to synthetic seismograms computed in the 3D elastic starting model. The procedure relies on the use of receiver-side strain Green tensors (RSGTs) and source-side earthquake wavefields (SEWs). The RSGTs are the spatio-temporal strain fields produced by three orthogonal unit impulsive point forces acting at the receiver. The SEWs are the wavefields generated by the actual point earthquake sources. We have constructed an RSGT database for 64 broadband stations in the Los Angeles region using the SCEC CVM and K. Olsen’s finite-difference code. The Fréchet (sensitivity) kernels for our time-frequency-dependent phase and amplitude measurements are computed by convolving the SEWs with the RSGTs. To set up the structural inverse problem, we made about 20,000 phase-delay and amplitude-reduction measurements on 814 P waves, 822 SH waves, and 293 SV waves from 72 small local earthquakes (3.0 < ML < 4.8). Using the CVM as our starting model and the data and their Fréchet kernels, we have obtained a revised 3D model, LABF3D, for the Los Angeles basin area. To our knowledge, this is the first “fully 3D” inversion of waveform data for regional earth structure.
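Conceptually, the kernel computation described above pairs the two wavefields point by point: at each spatial location, the source wavefield is convolved in time with the receiver-side Green function and evaluated at the measurement time. The toy scalar analog below uses random stand-in arrays, not actual RSGTs or SEWs:

```python
import numpy as np

rng = np.random.default_rng(0)

NT, NX = 256, 10                       # time samples, spatial grid points (toy sizes)
sew = rng.standard_normal((NX, NT))    # stand-in source-side earthquake wavefield
rsgt = rng.standard_normal((NX, NT))   # stand-in receiver-side strain Green function

def sensitivity_kernel(sew, rsgt, t_sample):
    """Toy scalar analog of a Frechet kernel: at each grid point, convolve
    the source wavefield with the receiver Green function in time and read
    the result at the measurement time sample."""
    nx, nt = sew.shape
    kernel = np.empty(nx)
    for ix in range(nx):
        conv = np.convolve(sew[ix], rsgt[ix])   # full convolution, length 2*nt - 1
        kernel[ix] = conv[t_sample]
    return kernel

k = sensitivity_kernel(sew, rsgt, t_sample=NT)
```

The real computation is tensorial and carries the time-frequency localization of each measurement, but the convolution structure is the same.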


Detection of Temporally and Spatially Limited Periodic Earthquake Recurrence in Synthetic Seismic Records

Zielke, Olaf (ASU)

The nonlinear dynamics of fault behavior are dominated by complex interactions among the multiple processes controlling this system. For example, temporal and spatial variations in pore pressure, healing effects, and stress transfer cause significant heterogeneities in fault properties and in the stress field at the sub-fault level. Numerical and laboratory fault models show that the interaction of large systems of fault elements causes the entire system to develop into a state of self-organized criticality. Once in this state, small perturbations of the system may result in chain reactions (i.e., earthquakes) that can affect any number of fault segments. This sensitivity to small perturbations is strong evidence for chaotic fault behavior, which implies that exact event prediction is not possible. Earthquake prediction with useful accuracy is nevertheless possible.

Studies of other natural chaotic systems have shown that they may enter states of metastability in which the system’s behavior is predictable. Applying this concept to earthquake faults, these windows of metastable behavior should be characterized by periodic earthquake recurrence. One can argue that the observed periodicity of the Parkfield, CA (M ≈ 6) events resembles such a window of metastability.

I am statistically analyzing numerically generated seismic records to study the existence of these phases of periodic behavior. In this preliminary study, seismic records were generated using a model introduced by Nakanishi [Phys. Rev. A, 43, #12, 6613-6621, 1991]. It consists of a one-dimensional chain of blocks (interconnected by springs) with a relaxation function that mimics velocity-weakening frictional behavior. The earthquakes occurring in this model show a power-law frequency-size distribution as well as clusters of small events that precede larger earthquakes.

I have analyzed time series of single-block motions within the system. These time series show noticeable periodicity during certain intervals in an otherwise aperiodic record. The periodicity is generally limited to the largest earthquakes. The maximum event size is a function of the system’s stiffness and the degree of velocity weakening. The observed periodic recurrence resembles the characteristic earthquake model, in that the earthquakes involved occur at the same location with the same slip distribution. This periodic behavior is essentially limited to the largest, and therefore most destructive, events occurring in certain time windows. Within these windows of periodic behavior, the prediction of large earthquakes becomes a straightforward task. Further studies will attempt to determine the characteristics of the onset, duration, and end of these windows of periodic behavior.
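One simple way to flag such windows in a synthetic event record (an illustrative diagnostic, not the analysis used in this study) is a sliding-window coefficient of variation of inter-event times, where values near zero mark quasi-periodic recurrence:

```python
def periodic_windows(event_times, window=5, cv_max=0.1):
    """Flag windows of quasi-periodic recurrence: for each run of `window`
    consecutive inter-event intervals, compute the coefficient of variation
    (std / mean); a CV at or below cv_max marks near-periodic behavior.
    Returns (start_time, end_time) tuples of flagged windows."""
    intervals = [t1 - t0 for t0, t1 in zip(event_times, event_times[1:])]
    flagged = []
    for i in range(len(intervals) - window + 1):
        w = intervals[i:i + window]
        mean = sum(w) / window
        var = sum((x - mean) ** 2 for x in w) / window
        cv = var ** 0.5 / mean
        if cv <= cv_max:
            flagged.append((event_times[i], event_times[i + window]))
    return flagged

# A synthetic record that is periodic (spacing 10) early on, then irregular:
times = [0, 10, 20, 30, 40, 50, 60, 63, 81, 95, 124]
print(periodic_windows(times))   # → [(0, 50), (10, 60)]
```

Detecting the onset and end of such windows in the Nakanishi-model catalogs would amount to tracking where this statistic drops below, and later rises above, the chosen threshold.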
