Atlantic Ocean Data Portal

The US National Multi-Model Ensemble (NMME) is an experimental multi-model seasonal forecasting system consisting of coupled models from US modeling centers, including NOAA/NCEP, NOAA/GFDL, IRI, NCAR, NASA, and Canada's CMC.

So it's OK if "open data portals suck" if resources are being spent instead on actual data release. That sounds exactly like the kind of value-added service that the open data community can provide, and I don't see much wrong with it.

I've been waiting to reply until we had some more written up about the Housing Data Hub ( ), so I'm jumping into the game a little late. Basically, we've been working for the past couple of months on building it, and we're very actively working toward using our open data APIs in it. But while I'm on that distinction, let me air my broader concern about dogfooding and open data portals.

The Mid-Atlantic Ocean Data Portal is an online toolkit and resource center that consolidates available data and enables state, federal, and local users and the general public to visualize and analyze ocean resources and human-use information such as fishing grounds, recreational areas, shipping lanes, habitat areas, and energy sites, among others.

There's much more about the process of building here: -digital-barn. Mostly, that gives an overview of our approach; I'll have deeper reflections and technical points later on about how we used the portal APIs and what we learned in doing so (likely as part of the code documentation). I think the open data world is still at a very early stage of maturity around processes, methods, and technologies.

On Tuesday, officials announced that the California Department of Managed Health Care has published its first datasets to the state's Open Data Portal, Techwire reports (Techwire, 5/27). This is why I wouldn't build a product for accessing affordable housing using the open data portal as a backend.
The current portal products are by their nature broad and apply to many different data and needs, but sometimes a product's requirements cannot be met by an open data portal: for example, handling private data or complex relational data, or maintaining SLAs at much higher levels than open data portals can currently provide. The Housing Data Hub is an example where we're dogfooding, and you'll see more in a couple of months.

The CMTS-MTS data portal is a comprehensive inventory of available web-based information about the Marine Transportation System (MTS).

I'm not saying you're suggesting using open data portals for all government services, but I want to reiterate that, broadly, we need to use the appropriate technology for the job and consciously design open data access into those approaches so that surfacing the public data is straightforward.

The New Mexico Resource Geographic Information System (RGIS) has launched a new data portal (-/) for New Mexico GIS resources.

All that said, I totally agree with the notion of dogfooding and what it can bring to government, but it is one piece of a broader strategy to institutionalize open data approaches at the City. The things we learn will be shared back with the world through our website and GitHub repos, and internally through our City training program, the Data Academy ( ).

We meet lots of small, medium, and large enterprises that recognize they need to do something with Big Data. With the ORTEC Big Data Portal, you don't have to worry about the IT side or finding the right quants.
Just like the Software as a Service concept, Analytics as a Service simplifies: the focus is on actionable insights rather than IT, reducing Big Data to something every company can do. Just as with Software as a Service, you leave the backend work to us; all you do is click and wait for your report to appear. Don't let anyone tell you that our Big Data Portal is similar to their Big Data Platform: a Big Data Platform assumes you have specialist expertise and capabilities in-house. That makes our Big Data Portal the solution for every company that wants the opportunities created by Big Data without the hassle and expense of doing it yourself.

One of the many great features of the new Specimen Records database at the Data Portal is that the portal enables you to download the entire database as a single plain-text table, over 3 GB in size. The first part of the URL is the base URL, the typical CKAN DataStore Data API endpoint for data search. The second part specifies which exact database on the Data Portal you'd like to search: each database has its own 32-digit GUID to uniquely identify it. There are currently 25 different databases/datasets available at the NHM Data Portal, including data from the PREDICTS project, which assesses ecological diversity in changing terrestrial systems. The output doesn't look pretty to human eyes, but to a computer it is cleanly structured data that can easily be further analysed, manipulated, or converted.

Responsibilities include:
- Coordinating with the Portal Team to track the overall workflow, and tracking project activities and grant deliverables.
- Compiling, updating, and maintaining the best available spatial data in collaboration with data providers.
- Updating and maintaining the Data Portal's data structure, background information, and multimedia content via the Data Portal's administrative interface.
- Identifying and addressing high-priority data development needs and customizing data products to optimize user experience and utility for ocean planning applications.
- Identifying and addressing Data Portal platform and feature-set development needs to optimize user experience and utility for ocean planning applications.
- Liaising with Regional Planning Body working groups to align Data Portal resources to support Ocean Action Plan development and implementation.
- Staffing the Data Portal exhibit at meetings and events, and offering hands-on demonstrations to interested parties.
- Receiving inquiries from the general public about the Data Portal and coordinating with the Portal Team to provide responses.
- Assisting Data Portal users with questions regarding data, functionality, and application to ocean planning.
- Providing updates to MARCO staff (weekly) and the MARCO Management Board (monthly) on Portal Team completed and planned activities.
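The two-part DataStore URL described earlier (a base CKAN `datastore_search` endpoint plus a per-dataset GUID) can be sketched as below. This is a minimal illustration, not the portal's documented client: the host `data.nhm.ac.uk` and the GUID are assumptions for the example, so check the portal's API documentation for real resource IDs.

```python
from urllib.parse import urlencode

# First part of the URL: the standard CKAN DataStore search endpoint
# (assumed host for the NHM Data Portal).
BASE_URL = "https://data.nhm.ac.uk/api/3/action/datastore_search"

def datastore_search_url(resource_id, q=None, limit=5):
    """Build a search URL for one dataset, identified by its GUID."""
    params = {"resource_id": resource_id, "limit": limit}
    if q:
        params["q"] = q  # optional full-text search term
    return BASE_URL + "?" + urlencode(params)

# Second part of the URL: a placeholder 32-digit GUID naming one dataset.
guid = "05ff2255-c38a-40c9-b657-4ccb55ab2feb"
print(datastore_search_url(guid, q="Linnaeus"))
```

Fetching that URL returns JSON, which is the cleanly structured (if not pretty) output mentioned above; the matching rows sit under `result.records` and can be parsed with any JSON library.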
Qualifications include:
- Demonstrated competency with data development, analysis, and storage, including spatial and non-spatial data.
- Effective communication skills and a proven ability to translate data needs and limitations for a non-technical audience.
- Familiarity with current and emerging coastal and ocean issues in the Mid-Atlantic region and the associated data needs.

Several years ago, when looking to update the online mapping tools used for the Pittsburgh Neighborhood and Community Information System, we spent some time learning how to structure a software selection process. Developing the data portal in house was an option we wanted to explore alongside all the others. By doing so, we'd be in a much better position to move through the purchasing process in a way that was structured around the needs of the Regional Data Center and our users, not around the lowest price. We're grateful to have had a number of people from the University of Pittsburgh, Allegheny County, the City of Pittsburgh, Carnegie Mellon University, Open Pittsburgh (our Code for America Brigade), and U.S. Open Data involved in our selection process. The conversations around software selection also influenced our thinking about how to structure the Data Center's staffing, programs, and activities.

To structure the software selection process, we issued a Request for Information (RFI) on December 15, 2014. In this open request, we asked respondents to tell us why we should consider their product for use with the Data Center. The Regional Data Center project manager conducted phone interviews with each of the firms that made it to the second round of consideration.

The Western Pennsylvania Regional Data Center supports key community initiatives by making public information easier to find and use.
The Data Center also hosts datasets from these and other public-sector agencies, academic institutions, and non-profit organizations. It provides a technological and legal infrastructure for data sharing to support a growing ecosystem of data providers and data users. The Data Center maintains Allegheny County's and the City of Pittsburgh's open data portal and provides a number of services to data publishers and users. It is managed by the University of Pittsburgh's Center for Social and Urban Research and is a partnership of the University, Allegheny County, and the City of Pittsburgh.

The portal allows anyone to easily search, explore, link, download, and reuse the data for commercial or non-commercial purposes through a catalogue of common metadata. Through this catalogue, users access data stored on the websites of the EU institutions, agencies, and other bodies. The metadata catalogue can be searched via an interactive search engine (Data tab) and through SPARQL queries (Linked data tab). The number of data providers, which include Eurostat, the European Environment Agency, and the Joint Research Centre, continues to grow.
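As a sketch of the Linked data route, the snippet below builds a SPARQL request that asks the metadata catalogue for a handful of dataset titles using the DCAT vocabulary commonly used for open-data catalogues. The endpoint URL is an assumption for illustration (the portal's Linked data tab lists the current one), and the request is only constructed here, not sent.

```python
from urllib.parse import urlencode

# Assumed SPARQL endpoint; verify against the portal's "Linked data" tab.
SPARQL_ENDPOINT = "https://data.europa.eu/sparql"

# DCAT models catalogues of datasets; dct:title holds each dataset's name.
QUERY = """
PREFIX dct:  <http://purl.org/dc/terms/>
PREFIX dcat: <http://www.w3.org/ns/dcat#>
SELECT ?dataset ?title
WHERE {
  ?dataset a dcat:Dataset ;
           dct:title ?title .
}
LIMIT 5
"""

# A GET request with the query URL-encoded is the simplest way to call a
# SPARQL endpoint; the resulting URL can be fetched with urllib.request.
url = SPARQL_ENDPOINT + "?" + urlencode(
    {"query": QUERY, "format": "application/sparql-results+json"}
)
print(url)
```

Requesting JSON results (rather than the default XML) keeps the response easy to parse alongside the rest of a data pipeline.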