The paper presents a study assessing the accuracy of determining the Qibla direction, which indicates the direction of the Kaaba in Mecca, in mosques in Indonesia. The methodology involves using Google Earth to measure the Qibla direction in four mosques and analyzing deviations. Results found deviations in three mosques' directions, indicating Google Earth provides an accurate, easy alternative to traditional methods. The study highlights the tool's applicability for calibrating Qibla directions without extensive astronomy knowledge.
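The azimuth measured in Google Earth can be cross-checked analytically: the Qibla is the initial bearing of the great circle from a mosque to the Kaaba. A minimal Python sketch of that spherical-trigonometry check, assuming approximate Kaaba coordinates and a Jakarta test point that are not taken from the paper:

```python
import math

KAABA_LAT, KAABA_LON = 21.4225, 39.8262  # Kaaba, Mecca (approximate WGS84)

def qibla_bearing(lat, lon):
    """Initial great-circle bearing (degrees clockwise from true north)
    from the point (lat, lon) toward the Kaaba."""
    phi1, phi2 = math.radians(lat), math.radians(KAABA_LAT)
    dlon = math.radians(KAABA_LON - lon)
    x = math.sin(dlon) * math.cos(phi2)
    y = (math.cos(phi1) * math.sin(phi2)
         - math.sin(phi1) * math.cos(phi2) * math.cos(dlon))
    return math.degrees(math.atan2(x, y)) % 360

# Example: central Jakarta (about 6.2 S, 106.8 E); published Qibla
# values for Jakarta are roughly 295 degrees (west-northwest).
print(round(qibla_bearing(-6.2, 106.8), 1))
```

A measured mosque orientation can then be compared against this bearing to estimate the deviation the paper reports.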
Research Data Infrastructure for Geochemistry (DFG Roundtable) by Kerstin Lehnert
This presentation provides an overview of different aspects of data management for geochemistry and resources available at the EarthChem@IEDA data facility.
A presentation given by Manjula Patel (UKOLN) at the Repository Curation Environments (RECURSE) Workshop held at the 4th International Digital Curation Conference, Edinburgh, 1st December 2008,
http://www.dcc.ac.uk/events/dcc-2008/programme/
Cyberinfrastructure to Support Ocean Observatories by Larry Smarr
05.03.18
Invited Talk to the Ocean Studies Board
National Research Council
Title: Cyberinfrastructure to Support Ocean Observatories
University of California San Diego
This document proposes an expansion of the GeneLab resource to better collect, analyze, and share spaceflight experimental data. The expansion involves: 1) Adopting a model similar to the Department of Energy's KBase platform to integrate data management, analysis tools, and results sharing for space biology research. 2) Designing a robust graphical user interface to engage the scientific community in accessing, searching, analyzing, and comparing spaceflight data through intuitive tutorials and publication/analysis suggestions. The goal is to maximize the impact of spaceflight experiments by bringing together scattered omics data in a redesigned, quality-controlled platform and increasing data/analysis reuse.
This document outlines plans to develop systems and tools to support the University of Liverpool in meeting requirements for the upcoming Research Excellence Framework (REF), which will replace the Research Assessment Exercise (RAE). It discusses developing an institutional repository for research publications, providing open access to publications, collecting metadata, and creating a searchable "oracle of research excellence." The library will work on migrating data from existing systems, tools for data collection, and redeveloping staff research profiles and other reports. This will help support REF requirements while also providing benefits like increased research visibility and funding.
Lightning Talks: All EarthCube Funded Projects by EarthCube
This document summarizes two projects funded by EarthCube:
1. The first project aims to enable agile and sustainable institutional arrangements to support EarthCube's mission. It will document lessons for similar initiatives and advance organizational theory. Key milestones include stakeholder surveys and facilitating the chartering of EarthCube assemblies.
2. The second project, C4P, aims to advance the role of cyberinfrastructure in paleobioscience studies. It will build partnerships, catalog existing resources, and promote standards. Activities include workshops, an outreach campaign, and developing a steering committee to guide the project.
Mark Yashar's curriculum vitae provides information about his education and employment history. He holds a Ph.D. in Physics from UC Davis and has worked on various research projects involving astrophysics, cosmology, climate modeling, and radio astronomy. His skills include programming languages like Python, C++, and Fortran. He has experience using modeling and analysis tools in areas like atmospheric science, astronomy, and computational physics.
The document discusses the evolution of science and research from the 1940s to present day. It notes Vannevar Bush's 1945 concerns about the growing mountain of research that scientists did not have time to fully understand or remember. It then discusses the current "data explosion" and challenges of accessing, sharing, and building on increasingly large amounts of data and research. The document advocates for reusable, reproducible, and transparent science through connected resources and environments that facilitate collaboration and knowledge sharing.
EarthCube's OceanLink - Project Overview and Presentation Updates (March 2014) by EarthCube
EAGER: Collaborative Research: EarthCube Building Blocks, Leveraging Semantics and Linked Data for Geoscience Data Sharing and Discovery or "OceanLink" is one of 15 EarthCube-funded components.
This presentation includes an OceanLink Project Overview (slides 1-12), followed by several presentations highlighting separate project efforts and updates to different audiences:
Slide 13: "Ontologies in a data-driven world." Montana State University Computer Science Department, March 3, 2014.
Slide 44: "Towards ontology patterns for ocean science repository integration", Ontology Summit 2014, Ontolog online session January 2014.
Slide 82: "OceanLink: Using Patterns for Discovery in EarthCube," GeoVoCampSB2014, Santa Barbara, March 2014.
Slide 118: "Ontologies in a data driven world," IBM T.J. Watson Research Center, January 2014.
Understanding the Big Picture of e-Science by Andrew Sallans
E-science involves large-scale collaborative research enabled by new technologies like high-speed networks and cheap data storage. It produces massive amounts of complex data from areas like climate modeling, particle physics experiments, biomedical research grids, and citizen science projects. This represents a major change for research that requires new infrastructure, expertise, and approaches. Universities like UVA are responding by establishing research computing support services in their libraries to help scientists with the computational and data aspects of e-science throughout the research lifecycle.
Integration of oreChem with the eCrystals repository for crystal structures by Mark Borkum
This document discusses integrating the oreChem ontology for representing scientific experiments and their provenance with the eCrystals repository for crystallography data. It describes how eCrystals represents crystallography experiments and data, the motivation for open access to research, and the oreChem ontology's concepts for representing methodologies, enactments, and provenance. It then outlines a proposed plugin for eCrystals that would map eCrystals data to oreChem concepts to capture the provenance of experiments and link data to the methods used.
The document describes the eBank UK project, which seeks to link e-research data, scholarly communication, and e-learning by building connections from data generated in experiments through publications and into educational resources. It discusses the scholarly knowledge cycle and how eBank UK is addressing the bottleneck of data publication by developing a distributed information architecture with common data standards and ontologies. This will allow an aggregator to harvest metadata from repositories holding experimental data and publications and provide a single access point for discovery across distributed resources through services like search and retrieval.
Citing and reading behaviours in high energy physics by Proyecto CeVALE2
This document analyzes citation and reading behaviors in the field of high-energy physics (HEP). It finds that:
1) There is a large citation advantage for HEP papers that are freely available as preprints online, as demonstrated by an analysis of citations in the SPIRES database.
2) No discernible citation advantage was found for publishing papers in open access journals compared to non-open access journals.
3) An analysis of usage logs in the SPIRES digital library shows that HEP scientists rarely read journals and prefer accessing preprints instead.
Supporting the research lifecycle of geo-GSNL initiative through HPC and Rese... by Raul Palma
Volcanic eruptions are among the most spectacular and dangerous phenomena on Earth, capable of generating disasters at various scales. The Geohazard Supersites and Natural Laboratories initiative (GSNL) is today a network of 11 Supersites, including volcanoes and seismic areas. Complex algorithms are used to analyse these data and extract important information on volcano activity. In addition to computing power and resources, researchers from the geo-GSNL community, like many other data-intensive science communities, are calling for innovative ways to manage their data, methods, and other resources that can enhance the visibility of scientific breakthroughs, encourage reuse, and foster broader research accessibility. In this contribution we present the results of the EVER-EST project (H2020-EINFRA-2015-1), in which, in collaboration with different partners, we created a virtual research environment (VRE) for Earth Science (https://vre.ever-est.eu/) that embraces the research object concept and technologies at its core.
Strong field science core proposal for uphill site by ahsanrabbani
This document provides a 3-year project summary for an international collaboration on strong field science:
1. The project aims to strengthen India's participation in experiments studying matter under extreme light intensities through international collaboration.
2. Indian researchers will perform experiments at international facilities and host foreign researchers, while also building expertise for potential future domestic facilities.
3. The project aims to consolidate Indian efforts in this area and increase international exposure, with a goal of developing robust Indian competence in studying matter with petawatt laser pulses.
The role of biodiversity informatics in GBIF, 2021-05-18, by Dag Endresen
The document discusses the role of biodiversity informatics and the Global Biodiversity Information Facility (GBIF) in making biodiversity data available through open access. GBIF provides free and open access to over 1.6 billion species occurrence records from over 1600 data publishers. The document highlights how digitizing natural history collections and integrating diverse biodiversity data sources can support research and policy goals. It emphasizes best practices like using common data standards, publishing datasets on GBIF to make them widely discoverable and reusable, and citing data with DOIs to incentivize open data sharing.
The Materials Project: Applications to energy storage and functional materia... by Anubhav Jain
The Materials Project is a free online database containing calculated properties of over 150,000 materials designed to help researchers discover new functional materials. It has been used extensively in academia and industry to identify novel battery electrode materials and solid electrolytes through high-throughput computational screening. Researchers are now using the Materials Project dataset to train machine learning models to predict battery properties and screen for new materials. Related efforts aim to bridge the gap between computational design and physical synthesis by developing an automated synthesis lab to experimentally validate candidate materials identified from the database.
Jolene Robin-McCaskill is seeking a position combining her expertise in petrophysics, geophysics, engineering and computer science. She has a PhD in Geophysics from Stanford University and industry experience at ConocoPhillips processing seismic and well log data. Her background includes experimental design, data analysis and user documentation across academia, government and private sectors.
The Department of Energy's Integrated Research Infrastructure (IRI) by Globus
We will provide an overview of DOE’s IRI initiative as it moves into early implementation, what drives the IRI vision, and the role of DOE in the larger national research ecosystem.
Toward a Global Interactive Earth Observing Cyberinfrastructure by Larry Smarr
The document discusses the need for a new generation of cyberinfrastructure to support interactive global earth observation. It outlines several prototyping projects that are building examples of systems enabling real-time control of remote instruments, remote data access and analysis. These projects are driving the development of an emerging cyber-architecture using web and grid services to link distributed data repositories and simulations.
LambdaGrids--Earth and Planetary Sciences Driving High Performance Networks a... by Larry Smarr
05.02.04
Invited Talk to the NASA Jet Propulsion Laboratory
Title: LambdaGrids--Earth and Planetary Sciences Driving High Performance Networks and High Resolution Visualizations
Pasadena, CA
Data Facilities Workshop - Panel on Current Concepts in Data Sharing & Intero... by EarthCube
This series of presentations was given at the EarthCube Data Facilities End-User Workshop held January 15-17, 2014 in Washington, DC. This workshop provided a forum to discuss the unique requirements and challenges associated with developing the communication, collaboration, interoperability, and governance structures that will be required to build EarthCube in conjunction with existing and emerging NSF/GEO facilities.
This panel and discussion, specifically, outlined and explained several current concepts in data sharing and interoperability, featuring presentations by:
Paul Morin (UMN): Polar Cyberinfrastructure
Don Middleton (UCAR): Atmospheric/Climate
Kerstin Lehnert (LDEO): Domain Repositories & Physical Samples
David Schindel (CBOL, GRBio): Biological Perspective & Collections
Hank Loescher (NEON): Observation Networks
Daniel Fuka (Virginia Tech) and Ruth Duerr (NSIDC): Brokering
Ilya Zaslavsky (UCSD): Cross-Domain Interoperability
This document provides an introduction and update on the Earth Science Knowledge Graph (ESKG) project. The ESKG is building a knowledge graph from heterogeneous Earth science data sources to improve data discovery. It summarizes the goals and architecture of ESKG, provides an update on building the PO.DAAC Datasets Ontology, and outlines future plans which include integrating additional data sources and presenting ESKG at conferences.
Keynote presentation delivered at ELAG 2013 in Gent, Belgium, on May 29 2013. Discusses Research Objects and the relationship to work my team has been involved in during the past couple of years: OAI-ORE, Open Annotation, Memento.
Aspects of Reproducibility in Earth Science by Raul Palma
The document discusses aspects of reproducibility in earth science research within the European Virtual Environment for Research - Earth Science Themes (EVEREST) project. The key objectives of EVEREST are to establish an e-infrastructure to facilitate collaborative earth science research through shared data, models, and workflows. Research Objects (ROs) will be used to capture and share workflows, processes, and results to help ensure reproducibility and preservation of earth science research. An example RO is described for mapping volcano deformation using satellite imagery and other data sources. Issues around reproducibility related to data access, software dependencies, and manual intervention in workflows are also discussed.
Optimizing Gradle Builds - Gradle DPE Tour Berlin 2024 by Sinan KOZAK
Sinan, from the Delivery Hero mobile infrastructure engineering team, gives a deep dive into performance acceleration through Gradle build cache optimization, recounting the team's journey of solving complex build-cache problems and demonstrating what faster builds are possible. The case study reveals how overlapping outputs and cache misconfigurations led to significant increases in build times, especially as the project scaled up to numerous modules using Paparazzi tests. The journey from diagnosing to defeating cache issues offers valuable lessons on maintaining cache integrity without sacrificing functionality.
Null Bangalore | Pentesters Approach to AWS IAM by Divyanshu
#Abstract:
- Learn real-world methods for auditing AWS IAM (Identity and Access Management) as a pentester. We begin with a brief discussion of IAM, then walk through typical misconfigurations and their potential exploits, reinforcing IAM security best practices.
- Gain actionable insights into AWS IAM policies and roles, using hands on approach.
#Prerequisites:
- Basic understanding of AWS services and architecture
- Familiarity with cloud security concepts
- Experience using the AWS Management Console or AWS CLI.
- For hands on lab create account on [killercoda.com](https://killercoda.com/cloudsecurity-scenario/)
# Scenario Covered:
- Basics of IAM in AWS
- Implementing IAM Policies with Least Privilege to Manage S3 Bucket
- Objective: Create an S3 bucket with least privilege IAM policy and validate access.
- Steps:
- Create S3 bucket.
- Attach least privilege policy to IAM user.
- Validate access.
- Exploiting IAM PassRole Misconfiguration
- PassRole allows a user to pass a specific IAM role to an AWS service (e.g. EC2), typically for service access delegation. We then exploit a PassRole misconfiguration to gain unauthorized access to sensitive resources.
- Objective: Demonstrate how a PassRole misconfiguration can grant unauthorized access.
- Steps:
- Allow user to pass IAM role to EC2.
- Exploit misconfiguration for unauthorized access.
- Access sensitive resources.
- Exploiting IAM AssumeRole Misconfiguration with Overly Permissive Role
- An overly permissive IAM role configuration can lead to privilege escalation: create a role with administrative privileges and allow a user to assume that role.
- Objective: Show how overly permissive IAM roles can lead to privilege escalation.
- Steps:
- Create role with administrative privileges.
- Allow user to assume the role.
- Perform administrative actions.
- Differentiation between PassRole vs AssumeRole
Try at [killercoda.com](https://killercoda.com/cloudsecurity-scenario/)
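The least-privilege idea in the S3 scenario above can be sketched as a policy document, along with a simple check for the wildcard-action pattern that the PassRole and AssumeRole scenarios exploit. This is a minimal sketch, not the workshop's lab material; the bucket name and statement IDs are hypothetical:

```python
import json

BUCKET = "example-pentest-bucket"  # hypothetical bucket name

# Least-privilege policy: only the object read/write and listing
# actions the user actually needs, scoped to one bucket.
least_privilege_policy = {
    "Version": "2012-10-17",
    "Statement": [
        {
            "Sid": "AllowObjectReadWrite",
            "Effect": "Allow",
            "Action": ["s3:GetObject", "s3:PutObject"],
            "Resource": f"arn:aws:s3:::{BUCKET}/*",
        },
        {
            "Sid": "AllowBucketListing",
            "Effect": "Allow",
            "Action": "s3:ListBucket",
            "Resource": f"arn:aws:s3:::{BUCKET}",
        },
    ],
}

def has_wildcard_actions(policy):
    """Flag Allow statements granting '*' or 's3:*' -- the overly
    permissive pattern a pentester looks for during an IAM audit."""
    for stmt in policy["Statement"]:
        actions = stmt["Action"]
        if isinstance(actions, str):
            actions = [actions]
        if stmt.get("Effect") == "Allow" and any(
            a in ("*", "s3:*") for a in actions
        ):
            return True
    return False

print(json.dumps(least_privilege_policy, indent=2))
print(has_wildcard_actions(least_privilege_policy))  # False
```

In the labs, the same audit mindset applies to role trust policies: a role whose `iam:PassRole` or `sts:AssumeRole` grants use `"Resource": "*"` is the misconfiguration the exploitation steps rely on.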
EarthCube's OceanLink - Project Overview and Presentation Updates (March 2014)EarthCube
EAGER: Collaborative Research: EarthCube Building Blocks, Leveraging Semantics and Linked Data for Geoscience Data Sharing and Discovery or "OceanLink" is one of 15 EarthCube-funded components.
This presentation includes an OceanLink Project Overview (slides 1-12), followed by several presentations highlighting separate project efforts and updates to different audiences:
Slide 13: "Ontologies in a data-driven world." Montana State University Computer Science Department, March 3, 2014.
Slide 44: "Towards ontology patterns for ocean science repository integration", Ontology Summit 2014, Ontolog online session January 2014.
Slide 82: OceanLink: Using Patterns for Discovery in EarthCube, GeoVoCampSB2014, Santa Barbara, March, 2014
Slide 118: "Ontologies in a data driven world," IBM T.J. Watson Research Center, January 2014.
Understanding the Big Picture of e-ScienceAndrew Sallans
E-science involves large-scale collaborative research enabled by new technologies like high-speed networks and cheap data storage. It produces massive amounts of complex data from areas like climate modeling, particle physics experiments, biomedical research grids, and citizen science projects. This represents a major change for research that requires new infrastructure, expertise, and approaches. Universities like UVA are responding by establishing research computing support services in their libraries to help scientists with the computational and data aspects of e-science throughout the research lifecycle.
Integration of oreChem with the eCrystals repository for crystal structuresMark Borkum
This document discusses integrating the oreChem ontology for representing scientific experiments and their provenance with the eCrystals repository for crystallography data. It describes how eCrystals represents crystallography experiments and data, the motivation for open access to research, and the oreChem ontology's concepts for representing methodologies, enactments, and provenance. It then outlines a proposed plugin for eCrystals that would map eCrystals data to oreChem concepts to capture the provenance of experiments and link data to the methods used.
The document describes the eBank UK project, which seeks to link e-research data, scholarly communication, and e-learning by building connections from data generated in experiments through publications and into educational resources. It discusses the scholarly knowledge cycle and how eBank UK is addressing the bottleneck of data publication by developing a distributed information architecture with common data standards and ontologies. This will allow an aggregator to harvest metadata from repositories holding experimental data and publications and provide a single access point for discovery across distributed resources through services like search and retrieval.
Citing and reading behaviours in high energy physics.Proyecto CeVALE2
This document analyzes citation and reading behaviors in the field of high-energy physics (HEP). It finds that:
1) There is a large citation advantage for HEP papers that are freely available as preprints online, as demonstrated by an analysis of citations in the SPIRES database.
2) No discernible citation advantage was found for publishing papers in open access journals compared to non-open access journals.
3) An analysis of usage logs in the SPIRES digital library shows that HEP scientists rarely read journals and prefer accessing preprints instead.
Supporting the research lifecycle of geo-GSNL initiative through HPC and Rese...Raul Palma
Volcanic eruptions are among the most spectacular and dangerous phenomena on Earth, capable of generating disasters at various scales. The Geohazard Supersites and Natural Laboratories initiatives (GSNL) today is a network of 11 Supersites, including volcanoes and seismic areas. Complex algorithms are used to analyse these data and important information on the volcano activity. In addition to computing power and resources, researchers from the geo-gnsl community, as many other data-intensive science communities, are calling for innovative ways to manage their data, methods and other resources, which can enhance the visibility of scientific breakthroughs, encourage reuse, and foster a broader research accessibility. In this contribution we present the results of EVER-EST project (H2020-EINFRA-2015-1), in which we created in collaboration with different partners a virtual research environment (VRE) for Earth Science (https://vre.ever-est.eu/), embracing the research object concept and technologies at its core.
Strong field science core proposal for uph ill siteahsanrabbani
This document provides a 3-year project summary for an international collaboration on strong field science:
1. The project aims to strengthen India's participation in experiments studying matter under extreme light intensities through international collaboration.
2. Indian researchers will perform experiments at international facilities and host foreign researchers, while also building expertise for potential future domestic facilities.
3. The project aims to consolidate Indian efforts in this area and increase international exposure, with a goal of developing robust Indian competence in studying matter with petawatt laser pulses.
The role of biodiversity informatics in GBIF, 2021-05-18Dag Endresen
The document discusses the role of biodiversity informatics and the Global Biodiversity Information Facility (GBIF) in making biodiversity data available through open access. GBIF provides free and open access to over 1.6 billion species occurrence records from over 1600 data publishers. The document highlights how digitizing natural history collections and integrating diverse biodiversity data sources can support research and policy goals. It emphasizes best practices like using common data standards, publishing datasets on GBIF to make them widely discoverable and reusable, and citing data with DOIs to incentivize open data sharing.
The Materials Project: Applications to energy storage and functional materia...Anubhav Jain
The Materials Project is a free online database containing calculated properties of over 150,000 materials designed to help researchers discover new functional materials. It has been used extensively in academia and industry to identify novel battery electrode materials and solid electrolytes through high-throughput computational screening. Researchers are now using the Materials Project dataset to train machine learning models to predict battery properties and screen for new materials. Related efforts aim to bridge the gap between computational design and physical synthesis by developing an automated synthesis lab to experimentally validate candidate materials identified from the database.
Jolene Robin-McCaskill is seeking a position combining her expertise in petrophysics, geophysics, engineering and computer science. She has a PhD in Geophysics from Stanford University and industry experience at ConocoPhillips processing seismic and well log data. Her background includes experimental design, data analysis and user documentation across academia, government and private sectors.
The Department of Energy's Integrated Research Infrastructure (IRI)Globus
We will provide an overview of DOE’s IRI initiative as it moves into early implementation, what drives the IRI vision, and the role of DOE in the larger national research ecosystem.
Toward a Global Interactive Earth Observing CyberinfrastructureLarry Smarr
The document discusses the need for a new generation of cyberinfrastructure to support interactive global earth observation. It outlines several prototyping projects that are building examples of systems enabling real-time control of remote instruments, remote data access and analysis. These projects are driving the development of an emerging cyber-architecture using web and grid services to link distributed data repositories and simulations.
LambdaGrids--Earth and Planetary Sciences Driving High Performance Networks a...Larry Smarr
05.02.04
Invited Talk to the NASA Jet Propulsion Laboratory
Title: LambdaGrids--Earth and Planetary Sciences Driving High Performance Networks and High Resolution Visualizations
Pasadena, CA
Data Facilities Workshop - Panel on Current Concepts in Data Sharing & Intero...EarthCube
This series of presentations was given at the EarthCube Data Facilities End-User Workshop held January 15-17, 2014 in Washington, DC. This workshop provided a forum to discuss the unique requirements and challenges associated with developing the communication, collaboration, interoperability, and governance structures that will be required to build EarthCube in conjunction with existing and emerging NSF/GEO facilities.
This panel and discussion, specifically, outlined and explained several current concepts in data sharing and interoperability, featuring presentations by:
Paul Morin (UMN): Polar Cyberinfrastructure
Don Middleton (UCAR): Atmospheric/Climate
Kerstin Lehnert (LDEO): Domain Repositories & Physical Samples
David Schindel (CBOL, GRBio): Biological Perspective & Collections
Hank Leoscher (NEON): Observation Networks
Daniel Fuka (Virginia Tech) and Ruth Duerr (NSIDC): Brokering
Ilya Zaslavsky (UCSD): Cross-Domain Interoperability
This document provides an introduction and update on the Earth Science Knowledge Graph (ESKG) project. The ESKG is building a knowledge graph from heterogeneous Earth science data sources to improve data discovery. It summarizes the goals and architecture of ESKG, provides an update on building the PO.DAAC Datasets Ontology, and outlines future plans which include integrating additional data sources and presenting ESKG at conferences.
Keynote presentation delivered at ELAG 2013 in Gent, Belgium, on May 29 2013. Discusses Research Objects and the relationship to work my team has been involved in during the past couple of years: OAI-ORE, Open Annotation, Memento.
Aspects of Reproducibility in Earth ScienceRaul Palma
The document discusses aspects of reproducibility in earth science research within the European Virtual Environment for Research - Earth Science Themes (EVEREST) project. The key objectives of EVEREST are to establish an e-infrastructure to facilitate collaborative earth science research through shared data, models, and workflows. Research Objects (ROs) will be used to capture and share workflows, processes, and results to help ensure reproducibility and preservation of earth science research. An example RO is described for mapping volcano deformation using satellite imagery and other data sources. Issues around reproducibility related to data access, software dependencies, and manual intervention in workflows are also discussed.
Optimizing Gradle Builds - Gradle DPE Tour Berlin 2024 (Sinan Kozak)
Sinan from the Delivery Hero mobile infrastructure engineering team shares a deep dive into performance acceleration with Gradle build cache optimizations. Sinan shares their journey into solving complex build-cache problems that affect Gradle builds. By understanding the challenges and solutions found in our journey, we aim to demonstrate the possibilities for faster builds. The case study reveals how overlapping outputs and cache misconfigurations led to significant increases in build times, especially as the project scaled up with numerous modules using Paparazzi tests. The journey from diagnosing to defeating cache issues offers invaluable lessons on maintaining cache integrity without sacrificing functionality.
Null Bangalore | Pentesters Approach to AWS IAM (Divyanshu)
#Abstract:
- Learn real-world methods for auditing AWS IAM (Identity and Access Management) as a pentester. We begin with a brief discussion of IAM, then cover typical misconfigurations and their potential exploits to reinforce IAM security best practices.
- Gain actionable insights into AWS IAM policies and roles, using a hands-on approach.
#Prerequisites:
- Basic understanding of AWS services and architecture
- Familiarity with cloud security concepts
- Experience using the AWS Management Console or AWS CLI.
- For hands on lab create account on [killercoda.com](https://killercoda.com/cloudsecurity-scenario/)
# Scenario Covered:
- Basics of IAM in AWS
- Implementing IAM Policies with Least Privilege to Manage S3 Bucket
- Objective: Create an S3 bucket with least privilege IAM policy and validate access.
- Steps:
- Create S3 bucket.
- Attach least privilege policy to IAM user.
- Validate access.
- Exploiting IAM PassRole Misconfiguration
- PassRole allows a user to pass a specific IAM role to an AWS service (e.g., EC2), typically for service-access delegation. We then exploit a PassRole misconfiguration to gain unauthorized access to sensitive resources.
- Objective: Demonstrate how a PassRole misconfiguration can grant unauthorized access.
- Steps:
- Allow user to pass IAM role to EC2.
- Exploit misconfiguration for unauthorized access.
- Access sensitive resources.
- Exploiting IAM AssumeRole Misconfiguration with Overly Permissive Role
- An overly permissive IAM role configuration can lead to privilege escalation: create a role with administrative privileges and allow a user to assume it.
- Objective: Show how overly permissive IAM roles can lead to privilege escalation.
- Steps:
- Create role with administrative privileges.
- Allow user to assume the role.
- Perform administrative actions.
- Differentiating PassRole from AssumeRole
Try at [killercoda.com](https://killercoda.com/cloudsecurity-scenario/)
Introduction- e - waste – definition - sources of e-waste– hazardous substances in e-waste - effects of e-waste on environment and human health- need for e-waste management– e-waste handling rules - waste minimization techniques for managing e-waste – recycling of e-waste - disposal treatment methods of e- waste – mechanism of extraction of precious metal from leaching solution-global Scenario of E-waste – E-waste in India- case studies.
The CBC machine is a common diagnostic tool used by doctors to measure a patient's red blood cell count, white blood cell count and platelet count. The machine uses a small sample of the patient's blood, which is then placed into special tubes and analyzed. The results of the analysis are then displayed on a screen for the doctor to review. The CBC machine is an important tool for diagnosing various conditions, such as anemia, infection and leukemia. It can also help to monitor a patient's response to treatment.
Redefining brain tumor segmentation: a cutting-edge convolutional neural network (IJECEIAES)
Medical image analysis has witnessed significant advancements with deep learning techniques. In the domain of brain tumor segmentation, the ability to
precisely delineate tumor boundaries from magnetic resonance imaging (MRI)
scans holds profound implications for diagnosis. This study presents an ensemble convolutional neural network (CNN) with transfer learning, integrating
the state-of-the-art Deeplabv3+ architecture with the ResNet18 backbone. The
model is rigorously trained and evaluated, exhibiting remarkable performance
metrics, including an impressive global accuracy of 99.286%, a high-class accuracy of 82.191%, a mean intersection over union (IoU) of 79.900%, a weighted
IoU of 98.620%, and a Boundary F1 (BF) score of 83.303%. Notably, a detailed comparative analysis with existing methods showcases the superiority of
our proposed model. These findings underscore the model’s competence in precise brain tumor localization, underscoring its potential to revolutionize medical
image analysis and enhance healthcare outcomes. This research paves the way
for future exploration and optimization of advanced CNN models in medical
imaging, emphasizing addressing false positives and resource efficiency.
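The intersection over union (IoU) figures quoted above follow the standard segmentation metric: the overlap between predicted and ground-truth masks divided by their union. A minimal sketch of that definition, on hypothetical toy masks rather than the paper's data:

```python
def iou(pred, truth):
    """Intersection-over-union of two binary masks (flat 0/1 lists)."""
    inter = sum(p & t for p, t in zip(pred, truth))
    union = sum(p | t for p, t in zip(pred, truth))
    return inter / union if union else 1.0  # empty masks agree trivially

# Hypothetical 5-pixel masks: 2 pixels agree, 4 pixels are in either mask.
pred = [1, 1, 0, 0, 1]
truth = [1, 0, 0, 1, 1]
print(iou(pred, truth))  # 2 / 4 = 0.5
```

A mean IoU such as the reported 79.9% is simply this value averaged over classes.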
Use PyCharm for remote debugging of WSL on a Windows machine (shadow0702a)
This document serves as a comprehensive step-by-step guide on how to effectively use PyCharm for remote debugging of the Windows Subsystem for Linux (WSL) on a local Windows machine. It meticulously outlines several critical steps in the process, starting with the crucial task of enabling permissions, followed by the installation and configuration of WSL.
The guide then proceeds to explain how to set up the SSH service within the WSL environment, an integral part of the process. Alongside this, it also provides detailed instructions on how to modify the inbound rules of the Windows firewall to facilitate the process, ensuring that there are no connectivity issues that could potentially hinder the debugging process.
The document further emphasizes the importance of checking the connection between the Windows and WSL environments, providing instructions on how to ensure that the connection is optimal and ready for remote debugging.
It also offers an in-depth guide on how to configure the WSL interpreter and files within the PyCharm environment. This is essential for ensuring that the debugging process is set up correctly and that the program can be run effectively within the WSL terminal.
Additionally, the document provides guidance on how to set up breakpoints for debugging, a fundamental aspect of the debugging process which allows the developer to stop the execution of their code at certain points and inspect their program at those stages.
Finally, the document concludes by providing a link to a reference blog. This blog offers additional information and guidance on configuring the remote Python interpreter in PyCharm, providing the reader with a well-rounded understanding of the process.
Comparative analysis between traditional aquaponics and reconstructed aquaponics (bijceesjournal)
The aquaponic system of planting is a method that does not require soil usage. It is a method that only needs water, fish, lava rocks (a substitute for soil), and plants. Aquaponic systems are sustainable and environmentally friendly. Its use not only helps to plant in small spaces but also helps reduce artificial chemical use and minimizes excess water use, as aquaponics consumes 90% less water than soil-based gardening. The study applied a descriptive and experimental design to assess and compare conventional and reconstructed aquaponic methods for reproducing tomatoes. The researchers created an observation checklist to determine the significant factors of the study. The study aims to determine the significant difference between traditional aquaponics and reconstructed aquaponics systems propagating tomatoes in terms of height, weight, girth, and number of fruits. The reconstructed aquaponics system’s higher growth yield results in a much more nourished crop than the traditional aquaponics system. It is superior in its number of fruits, height, weight, and girth measurement. Moreover, the reconstructed aquaponics system is proven to eliminate all the hindrances present in the traditional aquaponics system, which are overcrowding of fish, algae growth, pest problems, contaminated water, and dead fish.
Embedded machine learning-based road conditions and driving behavior monitoringIJECEIAES
Car accident rates have increased in recent years, resulting in losses in human lives, properties, and other financial costs. An embedded machine learning-based system is developed to address this critical issue. The system can monitor road conditions, detect driving patterns, and identify aggressive driving behaviors. The system is based on neural networks trained on a comprehensive dataset of driving events, driving styles, and road conditions. The system effectively detects potential risks and helps mitigate the frequency and impact of accidents. The primary goal is to ensure the safety of drivers and vehicles. Collecting data involved gathering information on three key road events: normal street and normal drive, speed bumps, circular yellow speed bumps, and three aggressive driving actions: sudden start, sudden stop, and sudden entry. The gathered data is processed and analyzed using a machine learning system designed for limited power and memory devices. The developed system resulted in 91.9% accuracy, 93.6% precision, and 92% recall. The achieved inference time on an Arduino Nano 33 BLE Sense with a 32-bit CPU running at 64 MHz is 34 ms and requires 2.6 kB peak RAM and 139.9 kB program flash memory, making it suitable for resource-constrained embedded systems.
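The accuracy, precision, and recall figures quoted above follow the usual confusion-matrix definitions. A minimal sketch with hypothetical confusion counts (not the paper's actual evaluation data):

```python
def metrics(tp, fp, fn, tn):
    """Accuracy, precision, and recall from confusion-matrix counts."""
    accuracy = (tp + tn) / (tp + fp + fn + tn)
    precision = tp / (tp + fp)   # of predicted positives, how many were right
    recall = tp / (tp + fn)      # of actual positives, how many were found

    return accuracy, precision, recall

# Hypothetical counts for a driving-event classifier on 100 samples.
acc, prec, rec = metrics(tp=46, fp=3, fn=4, tn=47)
print(round(acc, 3), round(prec, 3), round(rec, 3))  # 0.93 0.939 0.92
```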
International Conference on NLP, Artificial Intelligence, Machine Learning and Applications (gerogepatton)
International Conference on NLP, Artificial Intelligence, Machine Learning and Applications (NLAIM 2024) offers a premier global platform for exchanging insights and findings in the theory, methodology, and applications of NLP, Artificial Intelligence, Machine Learning, and their applications. The conference seeks substantial contributions across all key domains of NLP, Artificial Intelligence, Machine Learning, and their practical applications, aiming to foster both theoretical advancements and real-world implementations. With a focus on facilitating collaboration between researchers and practitioners from academia and industry, the conference serves as a nexus for sharing the latest developments in the field.
KuberTENes Birthday Bash Guadalajara - K8sGPT first impressionsVictor Morales
K8sGPT is a tool that analyzes and diagnoses Kubernetes clusters. This presentation was used to share the requirements and dependencies to deploy K8sGPT in a local environment.
PRESENTATION ON ASTRONOMICAL IMAGERY PAPER REVIEW.pdf
1. PAPER REVIEW
ON PROCESSING ASTRONOMY IMAGERY USING BIG DATA TECHNOLOGY
NAME: ABDURRASHEED ALGHAZALEE OLUWASEGUN
DEPARTMENT OF SURVEYING AND GEOINFORMATICS
MATRIC NO: 190405054
LECTURER: DR. HAMID MOSAKU
06/01/2024
2. NAME OF JOURNAL: 2015 IEEE International Conference on Big Data (Big Data)
NAME OF PUBLISHER: Institute of Electrical and Electronics Engineers (IEEE)
JOURNAL CITATION: 10.1109/bigdata.2015.7363840
JOURNAL TITLE: Scientific computing meets big data technology: An astronomy use case
AUTHORS:
Zhao Zhang (University of California, Berkeley)
Kyle Barbary (Berkeley Center for Cosmological Physics and the Berkeley Institute for Data Science)
Frank Austin Nothaft (PhD student in computer science at UC Berkeley)
Evan Sparks (PhD student in computer science at UC Berkeley)
Oliver Zahn (Executive Director of the Berkeley Center for Cosmological Physics)
Michael J. Franklin (Department of Computer Science, University of Chicago)
David A. Patterson (Professor of Computer Science, UC Berkeley)
Saul Perlmutter (Professor of Physics, UC Berkeley)
3. 1 — ABSTRACT
2 — PROBLEM STATEMENT
3 — METHODOLOGY
4 — MAJOR CONTRIBUTION
5 — CRITIQUE OF THE PAPER
6 — REWRITING THE PAPER
4. ABSTRACT
Issues or Challenges Addressed: The paper addresses the substantial challenges arising from dramatic increases in dataset sizes across scientific disciplines, including astronomy, genomics, the social sciences, and neuroscience. This surge in data has become a significant bottleneck for research, particularly when programs optimized for small datasets on single-node workstations are applied to it. To tackle this bottleneck, researchers commonly rewrite programs in lower-level languages such as C or Fortran, or transition to distributed processing frameworks, to enhance processing capacity.
GOAL AND AIM
The paper underscores the crucial need to adapt to these evolving data-processing demands, emphasizing the transition from traditional single-node optimized programs to distributed frameworks and the associated challenges in maintaining efficiency and speed in scientific research.
METHODOLOGY
Instead of relying on the traditional High-Performance Computing (HPC) software stack, the study adopts Apache Spark, a modern big data platform. The authors introduce Kira, a flexible, distributed astronomy image processing toolkit built on Apache Spark, and implement a Source Extractor application (Kira SE) to serve as a use case.
5. PROBLEM STATEMENT
A typical night-sky survey produces 13 TB of astronomical imagery; processing such data has become a major bottleneck for scientific research, especially in astronomy. Over the planned 10-year project, the survey is expected to produce 60 PB of raw data, which will be consolidated into a 15 PB catalog. This timely-processing requirement, combined with the massive amount of data, poses a challenging throughput requirement for the processing pipeline.
NOTE
1 petabyte (PB) = 1,125,899,906,842,624 bytes (one quadrillion, one hundred twenty-five trillion, eight hundred ninety-nine billion, nine hundred six million, eight hundred forty-two thousand, six hundred twenty-four). If a 4K movie is about 100 GB of data, 1 petabyte of storage could hold roughly 11,000 such movies.
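The storage arithmetic on this slide can be checked directly. A quick sketch, using the slide's own assumption of roughly 100 GB per 4K movie:

```python
# Verify the slide's petabyte arithmetic.
PB_BYTES = 2**50            # 1 petabyte (binary) = 1,125,899,906,842,624 bytes
MOVIE_BYTES = 100 * 10**9   # ~100 GB per 4K movie (the slide's assumption)

movies_per_pb = PB_BYTES // MOVIE_BYTES
print(PB_BYTES)        # 1125899906842624
print(movies_per_pb)   # 11258 -> roughly 11,000 movies per petabyte
```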
6. METHODOLOGY
The paper utilizes Apache Spark in scientific analysis pipelines for diverse tasks, especially astronomical imagery processing, showcasing its superiority over conventional map-reduce frameworks such as Google's MapReduce and Apache Hadoop MapReduce. It acknowledges the flexibility limitations and performance issues of the map-reduce API, emphasizing the need for second-generation map-reduce execution systems like Apache Spark.
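Kira's actual source-extraction code is not reproduced in the slides. As an illustration only, the map-style parallelism the slide attributes to Spark can be sketched with Python's standard library; `extract_sources` and the image data below are hypothetical placeholders, and Spark would distribute this map across a cluster rather than local threads:

```python
from concurrent.futures import ThreadPoolExecutor

def extract_sources(image):
    # Hypothetical stand-in for per-image source extraction
    # (Kira SE runs SExtractor-style photometry at this step);
    # this toy version just counts pixels above a brightness threshold.
    return sum(1 for px in image if px > 200)

# Hypothetical stand-in for a night's worth of survey images.
images = [[100, 210, 50, 255], [0, 10, 220], [190, 201]]

# Spark's map over an RDD of images corresponds to this parallel map,
# except Spark schedules the work across many cluster nodes.
with ThreadPoolExecutor() as pool:
    counts = list(pool.map(extract_sources, images))

print(counts)  # [2, 1, 1]
```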
10. MAJOR CONTRIBUTION
The paper on the design and implementation of the Kira astronomy image processing toolkit makes several notable contributions, primarily focusing on computational performance improvement, cost-efficient I/O operations, and the provision of a flexible programming interface to facilitate code reuse.
11. CRITIQUE OF THE PAPER
Notable areas for improvement:
1) Limited Discussion on Integration Challenges: The paper falls short in addressing the challenges and trade-offs encountered during the integration with Apache Spark, particularly in the context of distributed file systems and parallel processing. A more in-depth exploration of integration complexities would offer valuable insights for both researchers and practitioners, enhancing the paper's overall contribution.
2) Wider Range of Audience Consideration: The publication lacks consideration for a broader audience. The terminologies, lexicon, and structural aspects of word usage are not explicit enough for public consumption. To broaden the appeal, the paper should incorporate language and explanations accessible to a wider audience without compromising the depth of scientific content.
3) Limited Discussion on Flexibility and Code Reuse: While highlighting the importance of a flexible programming interface for code reuse, the paper misses the opportunity to address ethical regulations governing code reuse. Clarifying the permissibility and ethical considerations in various locations or areas would enhance the paper's completeness and provide valuable guidance.
4) Lack of Quantitative Details and Real-Life Case Studies: The paper lacks quantitative details, especially in comparing Kira SE's processing of data with a real-life scientific data processing scenario. Including a comprehensive case study that compares the toolkit's performance in a real-world scientific data processing context would strengthen the paper's credibility and practical relevance.
12. REWRITING THE PAPER
I would have criticized the publication for neglecting a broader audience, with terminologies and structure not explicit enough for public understanding, and emphasized the importance of incorporating accessible language while maintaining scientific depth.
I would have noted that the discussion on flexibility and code reuse overlooked ethical regulations, requiring clarification on permissibility and ethical considerations.
I would have highlighted that the paper lacked quantitative details and real-life case studies, necessitating a comparison between Kira SE's data processing and real-world scientific scenarios for enhanced credibility and practical relevance.
13. PAPER REVIEW
ON QIBLA DIRECTION ACCURACY ANALYSIS BASED ON ASTRONOMY (GOOGLE EARTH), PERSPECTIVE OF ISLAMIC LAW
14. NAME OF JOURNAL: Journal of Islam and Science
NAME OF PUBLISHER: Institute of Research and Community Services (LP2M), Universitas Islam Negeri Alauddin Makassar
JOURNAL CITATION: https://doi.org/10.24252/jis.v9i1.30111
JOURNAL TITLE: QIBLA DIRECTION ACCURACY ANALYSIS BASED ON ASTRONOMY (GOOGLE EARTH), PERSPECTIVE OF ISLAMIC LAW
AUTHORS:
Sri Wahyuni
Samsuddin
Ekawati Hamzah
15. 1 — ABSTRACT
2 — PROBLEM STATEMENT
3 — METHODOLOGY
4 — MAJOR CONTRIBUTION
5 — CRITIQUE OF THE PAPER
6 — REWRITING THE PAPER
16. ABSTRACT
The study focuses on assessing the accuracy of determining the Qibla direction in mosques within the Tanete Riattang and Tanete Riattang Barat Districts. The goal is to evaluate the precision of the Qibla direction measured through astronomy (Google Earth) and to examine Islamic law perspectives on this measurement. The methodology involves a qualitative approach, incorporating expert opinions and descriptive qualitative data analysis. Results indicate that using Google Earth is relatively easy, highly accurate, and efficient for determining the Qibla direction, in contrast to less accurate methods such as compasses and the sun's shadow.
17. PROBLEM STATEMENT
The problem addressed in this article revolves around the discrepancy in determining the Qibla direction in mosques, particularly in Bone Regency, Indonesia. The conflict arises between traditional beliefs, often rooted in cultural customs and previous measurements by revered scholars, and the scientific approach based on astronomy. Resistance to change, influenced by cultural norms and reluctance to adopt scientific advancements, results in significant deviations from the accurate Qibla direction in many mosques.
18. METHODOLOGY
The methodology employed in measuring the Qibla direction of four mosques in Bone Regency involves using the Google Earth application. The selected sample includes the Al Mujahidin Old Mosque, Al Markas Al Ma'arif Mosque, Bone Grand Mosque, and Al Rizkullah Mosque (Biru Islamic Boarding School). The study relies on observations and accuracy assessments of each mosque's Qibla direction based on the Google Earth software.
Each mosque's historical background, construction details, and significance are provided to contextualize the findings. The results indicate deviations in Qibla direction for three of the four mosques, emphasizing the need for accurate determination to align with the Kaaba in Saudi Arabia. The methodology includes visual overviews of the Qibla direction for each mosque, facilitating a comprehensive assessment of the accuracy of their orientations.
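What a tool like Google Earth effectively measures is the initial great-circle bearing from the mosque to the Kaaba. A minimal sketch of that calculation, where the mosque coordinates are hypothetical values near Watampone in Bone Regency (the Kaaba coordinates are the commonly cited ones):

```python
from math import radians, degrees, sin, cos, atan2

KAABA = (21.4225, 39.8262)  # commonly cited latitude/longitude of the Kaaba

def qibla_bearing(lat, lon):
    """Initial great-circle bearing (degrees from true north) to the Kaaba."""
    phi1, phi2 = radians(lat), radians(KAABA[0])
    dlam = radians(KAABA[1] - lon)
    x = sin(dlam) * cos(phi2)
    y = cos(phi1) * sin(phi2) - sin(phi1) * cos(phi2) * cos(dlam)
    return (degrees(atan2(x, y)) + 360) % 360

# Hypothetical coordinates near Watampone, Bone Regency, South Sulawesi.
print(round(qibla_bearing(-4.54, 120.33), 1))  # roughly 292 degrees (WNW)
```

A mosque's measured orientation can then be compared against this bearing to quantify its deviation.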
19. RESULT ANALYSIS
(Slide figures) One mosque's Qibla direction deviates to the west; another deviates to the east.
20. RESULT ANALYSIS
(Slide figures) A third mosque's Qibla direction deviates to the northwest; the fourth points exactly in the Qibla direction.
21. MAJOR CONTRIBUTION
The major contribution of the article is highlighting Google Earth as an accessible and weather-independent tool for calibrating the Qibla direction. Emphasizing its ease of use, flexibility, and applicability, the article suggests that Google Earth serves as a practical alternative for Muslims to accurately determine the Qibla direction without the need for in-depth knowledge of astronomy.
22. CRITIQUE OF THE PAPER
Notable areas for improvement:
Firstly, the methodology lacks a clear explanation of how the Qibla direction was determined for each mosque, particularly regarding the Google Earth settings or parameters used. The absence of such detail, including the measured degrees of deviation, undermines the transparency and replicability of the study.
Additionally, while historical and architectural information is provided for each mosque, the relevance of such details to the Qibla deviation is not established, leading to an inconsistency in the paper's focus.
Moreover, the conclusion that deviations of approximately 2 degrees can be tolerated lacks clarification of the criteria for tolerance, introducing ambiguity. Addressing these gaps would enhance the paper's methodological rigor and clarity.
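To make the question of tolerance concrete, an angular deviation can be converted into a ground offset near Mecca. A rough sketch, assuming a mosque-to-Mecca distance on the order of 8,000 km (an assumed order of magnitude, not a figure from the paper):

```python
from math import radians, sin

DISTANCE_KM = 8000  # assumed mosque-to-Mecca distance (order of magnitude)

# Small-angle offset: a deviation of d degrees misses the target
# by roughly distance * sin(d) at the far end of the line.
for deviation_deg in (0.5, 1, 2):
    offset_km = DISTANCE_KM * sin(radians(deviation_deg))
    print(f"{deviation_deg:>4} deg -> misses Mecca by ~{offset_km:.0f} km")
```

Even a 2-degree deviation corresponds to a miss of a few hundred kilometers, which is why the criteria behind any stated tolerance matter.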
23. REWRITING THE PAPER
I would have improved the paper by offering a clearer explanation of how the Qibla direction was determined for each mosque, specifying the settings or parameters used in the Google Earth application; this detail is crucial for ensuring transparency and facilitating replication of the study.
I would have established the relevance of the historical and architectural information to the Qibla deviation, addressing the inconsistency in the paper's focus.
Furthermore, I would have sought clarification of the criteria for tolerating deviations of approximately 2 degrees, eliminating ambiguity in the conclusion. Addressing these gaps could enhance the paper's methodological rigor and overall clarity.