Cloud Accelerated Performance Based Seismic Design

Author(s): , ,
Presented at IABSE Symposium: Engineering the Future, Vancouver, Canada, 21-23 September 2017, published in , pp. 670-676
DOI: 10.2749/vancouver.2017.0670
Price: € 25.00 incl. VAT for PDF document
Download preview (PDF file): 0.19 MB


Bibliographic Details

Author(s): (Arup, Los Angeles, CA)
(Arup, Los Angeles, CA)
(Arup, Los Angeles, CA)
Medium: conference paper
Language(s): English
Conference: IABSE Symposium: Engineering the Future, Vancouver, Canada, 21-23 September 2017
Published in:
Page(s): 670-676
Total number of pages (PDF): 7
Year: 2017
DOI: 10.2749/vancouver.2017.0670
Abstract:

Non-linear Time History Analysis (NLTHA) is a key enabler of Performance Based Seismic Design (PBSD). Arup's Los Angeles office typically performs these simulations in the LS-DYNA solver. In order to respond to the demands of concurrent design projects, the authors have adopted a cloud-centric approach to accelerate our workflows and to enable the use of non-linear time history analysis as a design tool rather than only a verification tool. This paper presents our custom workflow, which enables a dramatic compression of the time required for these analyses. The workflow generates LS-DYNA models in parametric fashion via Rhino/Grasshopper. Since a single design iteration can result in 48 to 110 models from a range of ground motions and input parameters, these models are typically executed on a compute cluster with a large number of compute cores. The resulting analyses generate a large amount of data (8-16 TB), which we post-process by leveraging "Big Data" approaches typically used in other industries (e.g., financial or retail firms).
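
For illustration, the following is a minimal Python sketch of the kind of parametric batch generation the abstract describes: one LS-DYNA input deck per combination of ground motion and input parameter set. It is not the authors' actual Rhino/Grasshopper workflow; every file name, placeholder token, and parameter value below is a hypothetical assumption.

# Hypothetical sketch (not the paper's actual code): generate a batch of
# LS-DYNA keyword files, one per (ground motion, parameter set) combination.
# Assumes a template deck "model_template.k" containing placeholder tokens.
import itertools
from pathlib import Path

ground_motions = [f"gm_{i:02d}.txt" for i in range(1, 12)]  # e.g., 11 record files
damping_ratios = [0.02, 0.025, 0.03]                        # example input parameters

template = Path("model_template.k").read_text()             # parametric master deck
out_dir = Path("runs")
out_dir.mkdir(exist_ok=True)

for run_id, (gm, zeta) in enumerate(itertools.product(ground_motions, damping_ratios)):
    # Substitute this combination's inputs into the template deck.
    deck = (template
            .replace("<<GROUND_MOTION_FILE>>", gm)
            .replace("<<DAMPING_RATIO>>", str(zeta)))
    (out_dir / f"run_{run_id:03d}.k").write_text(deck)

# 11 motions x 3 parameter sets = 33 decks here; the paper reports 48 to 110
# models per design iteration, each submitted as a separate job to a compute
# cluster (e.g., via a batch scheduler) and post-processed in aggregate.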