[Pgi-wg] Input for PGI Roadmap Document about Virginia CampusGrid

David Wallom david.wallom at oerc.ox.ac.uk
Tue Mar 17 05:05:00 CDT 2009


Hello Morris,

But you are leaving out the as-yet-unquantified level of funding for the UMD, to
which gLite/ARC/UNICORE will certainly have access.

David


On 17/03/2009 09:54, "Morris Riedel" <m.riedel at fz-juelich.de> wrote:

> Hi Andrew, Duane,
> 
>   I know a lot about DEISA, EGEE, NorduGrid, and more recently also EDGES,
> because they have contributed within GIN - but for the
> GENESIS-II-based production Campus Grid I would like to have some
> background information for the PGI roadmap document:
> 
> (1)
> Approximately how many users are using the infrastructure daily?
> 
> 
> (2)
> Which codes are running on which platforms, e.g. CFD codes on parallel
> machines, or rather docking tools for 'embarrassingly parallel' PC pool jobs?
> 
> 
> (3)
> Which kinds of resources are interconnected - which kinds of HPC/HTC resources
> are part of the Virginia Campus Grid infrastructure?
> 
> 
> (4)
> What is the sustainability plan of the Campus Grid in general and GENESIS-II
> in particular?
> 
> In Europe the HPC community sees an evolution from DEISA to PRACE,
> while EGEE seems to be evolving towards an EGI. In both cases there is limited
> funding for middleware like UNICORE and gLite. The same is true for ARC, which
> is not only used in NorduGrid, but also in Baltic-Grid and so on.
> 
> 
> I think this is also quite valuable information for the whole PGI group.
> 
> Is there a specific link to the infrastructure/GENESIS-II website that acts as
> the main contact point for it?
> 
> Many thanks for this!
> 
> Furthermore, I would welcome you in the GIN group in order to evaluate your
> application requirements for those applications that require resources in
> more than one Grid. Many production Grids participate in this group (some
> more or less actively/passively) - so it would be nice if you joined the GIN
> community!
> 
> Basically, PGI is a result of collaborations undertaken in GIN (and clear
> demands by scientists who would like to use more than one Grid).
>  
> 
> Your co-chair,
> Morris
> 
> 
> 
>   
> 
> ------------------------------------------------------------
> Morris Riedel
> SW - Engineer
> Distributed Systems and Grid Computing Division
> Jülich Supercomputing Centre (JSC)
> Forschungszentrum Juelich
> Wilhelm-Johnen-Str. 1
> D - 52425 Juelich
> Germany
> 
> Email: m.riedel at fz-juelich.de
> Info: http://www.fz-juelich.de/jsc/JSCPeople/riedel
> Phone: +49 2461 61 - 3651
> Fax: +49 2461 61 - 6656
> 
> Skype: MorrisRiedel
> 
> "We work to better ourselves, and the rest of humanity"
> 
> Sitz der Gesellschaft: Jülich
> Eingetragen im Handelsregister des Amtsgerichts Düren Nr. HR B 3498
> Vorsitzende des Aufsichtsrats: MinDirig'in Bärbel Brumme-Bothe
> Vorstand: Prof. Dr. Achim Bachem (Vorsitzender),
> Dr. Ulrich Krafft (stellv. Vorsitzender)
> 
> 
> _______________________________________________
> Pgi-wg mailing list
> Pgi-wg at ogf.org
> http://www.ogf.org/mailman/listinfo/pgi-wg
