
Update #42

Merged
Merged 34 commits on Jan 24, 2020
Commits
4b06b8f
expose dataset ID and DOI/Handle in version response #6397
pdurbin Jan 13, 2020
4b49911
initial consolidation of the individual release notes files into 4.19
djbrooke Jan 15, 2020
48df526
a few more updates
djbrooke Jan 15, 2020
f39bf30
a few more updates before a PR
djbrooke Jan 17, 2020
f244e88
OIDC updates
djbrooke Jan 17, 2020
4010d41
reordering, adding more info for search API and metadata blocks
djbrooke Jan 17, 2020
ea94b0e
add section on Payara, links, other tweaks #6506
pdurbin Jan 17, 2020
57c5b97
changed datasetPid to datasetPersistentId
scolapasta Jan 17, 2020
a0f1330
changed datasetPid to datasetPersistentId
scolapasta Jan 17, 2020
f97057d
updating metadata block customization doc
qqmyers Jan 21, 2020
930e017
Merge pull request #6511 from IQSS/6397-datasetIdPid
kcondon Jan 21, 2020
0cade62
Merge pull request #6539 from IQSS/6506-4.19-release-notes
kcondon Jan 21, 2020
b2c84d3
Update pom.xml
kcondon Jan 21, 2020
5697d44
Update conf.py
kcondon Jan 21, 2020
6e68608
Update versions.rst
kcondon Jan 21, 2020
7dd1387
Merge pull request #6551 from IQSS/6550_update_version_to_4_19
kcondon Jan 21, 2020
bfe2327
Merge pull request #6548 from QualitativeDataRepository/IQSS/6547
kcondon Jan 22, 2020
6d04ca1
Fixed section ref links in guides [ref #6560]
mheppler Jan 23, 2020
8564e9a
Fixed section ref links in guides [ref #6560]
mheppler Jan 23, 2020
0d4b927
Fix section ref links in guides [ref #6560]
mheppler Jan 23, 2020
b6550a4
#6566 initial commit
donsizemore Jan 23, 2020
14857b7
removing "preparation of"
djbrooke Jan 23, 2020
d9df8d9
typo, removing two words
djbrooke Jan 23, 2020
7bd57d4
community feedback update
djbrooke Jan 23, 2020
f6d2973
updating file name, adding queries link
djbrooke Jan 24, 2020
07a0aca
updating index link to new page name
djbrooke Jan 24, 2020
1f644bf
adding SSHOC thanks, adding spreadsheet previewer
djbrooke Jan 24, 2020
9615fa6
added doc about reexports
djbrooke Jan 24, 2020
1d46871
Merge pull request #6567 from OdumInstitute/6566_normalize_native_api…
kcondon Jan 24, 2020
694b850
Merge pull request #6565 from IQSS/6560-guides-section-ref-links
kcondon Jan 24, 2020
b467701
Merge pull request #6568 from IQSS/4170-reinstall-text-update
kcondon Jan 24, 2020
2325ece
Merge pull request #6569 from IQSS/4169-useful-queries
kcondon Jan 24, 2020
5cbce1f
Merge pull request #6571 from IQSS/6256-spreadsheet-viewer-doc
kcondon Jan 24, 2020
80c3bbe
Merge pull request #6572 from IQSS/5952-reexportall-doc-update
kcondon Jan 24, 2020
12 changes: 0 additions & 12 deletions doc/release-notes/3937-new-installer-script.md

This file was deleted.

121 changes: 119 additions & 2 deletions doc/release-notes/4.19-release-notes.md
@@ -1,8 +1,125 @@
# Dataverse 4.19

This release brings new features, enhancements, and bug fixes to Dataverse. Thank you to all of the community members who contributed code, suggestions, bug reports, and other assistance across the project.

## Release Highlights

### Open ID Connect Support

Dataverse now provides basic support for any OpenID Connect (OIDC) compliant authentication provider.

Prior to supporting this standard, new authentication methods had to be added via pull request. OIDC support provides a standardized way to handle authentication, share user information, and more. You can use any compliant provider simply by loading a configuration file, without touching the codebase. Prominent providers such as Google support OIDC, and there are plenty of other options for attaching your installation to a custom, enterprise-grade authentication provider.

See the [OpenID Connect Login Options documentation](http://guides.dataverse.org/en/4.19/installation/oidc.html) in the Installation Guide for more details.

Support for attribute mapping, group syncing, and more is planned for future versions.
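As a sketch of what loading such a configuration can look like: OIDC providers are registered through the admin API with a small JSON payload. The issuer, client ID, and client secret below are placeholders, and the exact `factoryData` format should be checked against the OIDC documentation linked above.

```shell
# Sketch: register an OIDC provider via the admin API.
# The issuer/clientId/clientSecret values are placeholders; take the real
# ones from your OIDC provider's client registration.
payload='{
  "id": "oidc-example",
  "factoryAlias": "oidc",
  "title": "Example OIDC Provider",
  "subtitle": "",
  "factoryData": "type: oidc | issuer: https://op.example.org | clientId: my-client | clientSecret: CHANGE-ME",
  "enabled": true
}'

# Validate the JSON locally before sending it to the server.
echo "$payload" | python3 -m json.tool > /dev/null && echo "payload OK"

# Load it into a running Dataverse instance (uncomment to run):
# curl -X POST -H 'Content-Type: application/json' \
#   http://localhost:8080/api/admin/authenticationProviders --data "$payload"
```

Once loaded, the new provider appears as a login option without any code change.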

### Python Installer

We are introducing a new installer script, written in Python. It is intended to eventually replace the old installer (written in Perl). For now it is being offered as an (experimental) alternative.

See [README_python.txt](https://github.com/IQSS/dataverse/blob/v4.19/scripts/installer/README_python.txt) in scripts/installer and/or in the installer bundle for more information.

## Major Use Cases

Newly-supported use cases in this release include:

- Dataverse installation administrators will be able to experiment with a Python Installer (Issue #3937, PR #6484)
- Dataverse installation administrators will be able to set up OIDC-compliant login options by editing a configuration file, with no code change needed (Issue #6432, PR #6433)
- Following setup by a Dataverse administrator, users will be able to log in using OIDC-compliant methods (Issue #6432, PR #6433)
- Users of the Search API will see additional fields in the JSON output (Issues #6300, #6396, PR #6441)
- Users loading the support form will now be presented with the math challenge as expected and will be able to successfully send an email to support (Issue #6307, PR #6462)
- Users of https://mybinder.org can now spin up Jupyter Notebooks and other computational environments from Dataverse DOIs (Issue #4714, PR #6453)

## Notes for Dataverse Installation Administrators

### Security vulnerability in Solr

A serious security issue has recently been identified in multiple versions of the Solr search engine, including v7.3, which Dataverse currently uses. Follow the instructions below to verify that your installation is safe from a potential attack. A detailed description of the issue is available at the following link:

<A HREF="https://github.com/veracode-research/solr-injection#7-cve-2019-xxxx-rce-via-velocity-template-by-_s00py">RCE in Solr via Velocity Template</A>.

The vulnerability allows an intruder to execute arbitrary code on the system running Solr. Fortunately, it can only be exploited if the Solr API access point is open to direct access from public networks ("the outside world"), which is NOT needed in a Dataverse installation.

Our installation guides have always recommended firewalling Solr (port 8983) off from public access. We still recommend that you double-check your firewall settings and verify that the port is not accessible from outside networks. The simplest quick test is to try the following URL in your browser:

`http://<your Solr server address>:8983`

and confirm that you get "access denied", a timeout, or similar.
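The same check can be scripted, for example as part of a post-install checklist. This is a minimal sketch assuming `curl` is available; run it from a machine *outside* your network and substitute your Solr server address:

```shell
# Probe the Solr port from outside the firewall. A short timeout is enough:
# a properly firewalled port will simply not answer.
SOLR_HOST="localhost"   # replace with your Solr server address

if curl --silent --max-time 5 "http://${SOLR_HOST}:8983/" > /dev/null; then
  status="reachable"
  echo "WARNING: ${SOLR_HOST}:8983 is reachable; check your firewall rules"
else
  status="blocked"
  echo "OK: ${SOLR_HOST}:8983 is not reachable from here"
fi
```

A "reachable" result from an outside host means the port needs to be firewalled before the vulnerability can be considered mitigated.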

In most cases, when Solr runs on the same server as the Dataverse web application, you will only want the port accessible from localhost. We also recommend that you add the following argument to the Solr startup command: `-j jetty.host=127.0.0.1`. This will make Solr accept connections from localhost only, adding redundancy in case of a firewall failure.

In cases where Solr needs to run on a different host, make sure the firewall limits access to the port to the Dataverse web host(s) only, by specific IP address(es).

We would also like to reiterate that it is never a good idea to run Solr as root. Running the process as a non-privileged user substantially limits the potential damage if the instance is compromised.

### Citation and Geospatial Metadata Block Updates

We updated two metadata blocks in this release. Updating these metadata blocks is mentioned in the step-by-step upgrade instructions below.

### Run ReExportall

We made changes to the JSON Export in this release (#6246). If you'd like these changes to be reflected in your JSON exports, you should run ReExportall as part of the upgrade process. We've included this in the step-by-step instructions below.

### BinderHub

https://mybinder.org now supports spinning up Jupyter Notebooks and other computational environments from Dataverse DOIs.

### Widgets update for OpenScholar

We updated the widgets code so that widgets will keep working on OpenScholar sites after the upcoming OpenScholar upgrade to Drupal 8. If users of your dataverse have embedded widgets on an OpenScholar site that upgrades to Drupal 8, you will need to run this Dataverse version (or later) for the widgets to keep working.

### Payara tech preview

Dataverse 4 has always run on Glassfish 4.1, but changes in this release (PR #6523) should open the door to an eventual upgrade to Payara 5. Production installations of Dataverse should remain on Glassfish 4.1, but feedback from any experiments running Dataverse on Payara 5 is welcome via the [usual channels](https://dataverse.org/contact).

## Notes for Tool Developers and Integrators

### Search API

The boolean parameter `query_entities` has been removed from the Search API. The former "true" behavior (querying entities via direct database calls, for developer use) is now always in effect.

Additional fields are now available via the Search API, mostly related to information about specific dataset versions.
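For example, a script consuming the Search API might read the version information like this. This is a sketch: the JSON below is a made-up excerpt, and the version field names shown are assumptions, so check the actual output of `GET /api/search?q=...` on your installation.

```shell
# Hypothetical excerpt of a Search API result item; the version field
# names here are illustrative, not taken from the actual API response.
sample='{
  "name": "Example Dataset",
  "type": "dataset",
  "global_id": "doi:10.5072/FK2/EXAMPLE",
  "majorVersion": 1,
  "minorVersion": 0
}'

# Pull out a couple of fields with python3 (jq works equally well).
echo "$sample" | python3 -c '
import json, sys
item = json.load(sys.stdin)
print(item["global_id"], item["majorVersion"], item["minorVersion"])
'
```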

## Complete List of Changes

For the complete list of code changes in this release, see the <a href="https://github.com/IQSS/dataverse/milestone/86?closed=1">4.19 milestone</a> in GitHub.

For help with upgrading, installing, or general questions please post to the <a href="https://groups.google.com/forum/#!forum/dataverse-community">Dataverse Google Group</a> or email [email protected].

## Installation

If this is a new installation, please see our <a href="http://guides.dataverse.org/en/4.19/installation/">Installation Guide</a>.

## Upgrade

1. Undeploy the previous version.

- `<glassfish install path>/glassfish4/bin/asadmin list-applications`
- `<glassfish install path>/glassfish4/bin/asadmin undeploy dataverse`

2. Stop Glassfish, remove the generated directory, then start Glassfish again.

- `service glassfish stop`
- remove the generated directory: `rm -rf <glassfish install path>/glassfish4/glassfish/domains/domain1/generated`
- `service glassfish start`

3. Deploy this version.

- `<glassfish install path>/glassfish4/bin/asadmin deploy <path>dataverse-4.19.war`

4. Restart Glassfish.

5. Update Citation Metadata Block

- `wget https://github.com/IQSS/dataverse/releases/download/v4.19/citation.tsv`
- `curl http://localhost:8080/api/admin/datasetfield/load -X POST --data-binary @citation.tsv -H "Content-type: text/tab-separated-values"`

6. Update Geospatial Metadata Block

- `wget https://github.com/IQSS/dataverse/releases/download/v4.19/geospatial.tsv`
- `curl http://localhost:8080/api/admin/datasetfield/load -X POST --data-binary @geospatial.tsv -H "Content-type: text/tab-separated-values"`

7. (Optional) Run ReExportall to update JSON Exports

<http://guides.dataverse.org/en/4.19/admin/metadataexport.html?highlight=export#batch-exports-through-the-api>
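The steps above can be strung together into one script. This is an untested sketch, not an official upgrade script: the `GF_HOME` and war file paths are assumptions you must adapt, and a dry-run guard is included so you can review the commands before anything runs.

```shell
# Sketch of the 4.19 upgrade steps. DRY_RUN=1 (the default) only prints
# the commands; set DRY_RUN=0 to actually execute them.
GF_HOME="/usr/local/glassfish4"     # adjust to your Glassfish install path
WAR="/tmp/dataverse-4.19.war"       # adjust to where you downloaded the war
DRY_RUN="${DRY_RUN:-1}"

run() {
  if [ "$DRY_RUN" = "1" ]; then echo "DRY RUN: $*"; else "$@"; fi
}

run "$GF_HOME/bin/asadmin" undeploy dataverse
run service glassfish stop
run rm -rf "$GF_HOME/glassfish/domains/domain1/generated"
run service glassfish start
run "$GF_HOME/bin/asadmin" deploy "$WAR"
run service glassfish restart

# Metadata block updates (steps 5 and 6):
for block in citation geospatial; do
  run wget "https://github.com/IQSS/dataverse/releases/download/v4.19/${block}.tsv"
  run curl "http://localhost:8080/api/admin/datasetfield/load" -X POST \
      --data-binary "@${block}.tsv" -H "Content-type: text/tab-separated-values"
done
```

Review the dry-run output against the numbered steps above before running it for real, and run ReExportall separately per the link above if you want the JSON export changes applied.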
1 change: 0 additions & 1 deletion doc/release-notes/4714-binder.md

This file was deleted.

13 changes: 0 additions & 13 deletions doc/release-notes/6300-6396-search-api.md

This file was deleted.

14 changes: 0 additions & 14 deletions doc/release-notes/6426-reexport-all

This file was deleted.

16 changes: 0 additions & 16 deletions doc/release-notes/6432-basic-oidc-support.md

This file was deleted.

39 changes: 0 additions & 39 deletions doc/release-notes/dataverse-security-10-solr-vulnerability.md

This file was deleted.

@@ -1,5 +1,5 @@
TwoRavens explore file A system of interlocking statistical tools for data exploration, analysis, and meta-analysis: http://2ra.vn. See the :doc:`/user/data-exploration/tworavens` section of the User Guide for more information on TwoRavens from the user perspective and the :doc:`/installation/r-rapache-tworavens` section of the Installation Guide.
Data Explorer explore file A GUI which lists the variables in a tabular data file allowing searching, charting and cross tabulation analysis. See the README.md file at https://github.com/scholarsportal/Dataverse-Data-Explorer for the instructions on adding Data Explorer to your Dataverse; and the :doc:`/installation/prerequisites` section of the Installation Guide for the instructions on how to set up **basic R configuration required** (specifically, Dataverse uses R to generate .prep metadata files that are needed to run Data Explorer).
Whole Tale explore dataset A platform for the creation of reproducible research packages that allows users to launch containerized interactive analysis environments based on popular tools such as Jupyter and RStudio. Using this integration, Dataverse users can launch Jupyter and RStudio environments to analyze published datasets. For more information, see the `Whole Tale User Guide <https://wholetale.readthedocs.io/en/stable/users_guide/integration.html>`_.
File Previewers explore file A set of tools that display the content of files - including audio, html, `Hypothes.is <https://hypothes.is/>`_ annotations, images, PDF, text, video - allowing them to be viewed without downloading. The previewers can be run directly from github.io, so the only required step is using the Dataverse API to register the ones you want to use. Documentation, including how to optionally brand the previewers, and an invitation to contribute through github are in the README.md file. https://github.com/QualitativeDataRepository/dataverse-previewers
File Previewers explore file A set of tools that display the content of files - including audio, html, `Hypothes.is <https://hypothes.is/>`_ annotations, images, PDF, text, video, tabular data, and spreadsheets - allowing them to be viewed without downloading. The previewers can be run directly from github.io, so the only required step is using the Dataverse API to register the ones you want to use. Documentation, including how to optionally brand the previewers, and an invitation to contribute through github are in the README.md file. Initial development was led by the Qualitative Data Repository and the spreadsheet previewer was added by the Social Sciences and Humanities Open Cloud (SSHOC) project. https://github.com/QualitativeDataRepository/dataverse-previewers
Data Curation Tool configure file A GUI for curating data by adding labels, groups, weights and other details to assist with informed reuse. See the README.md file at https://github.com/scholarsportal/Dataverse-Data-Curation-Tool for the installation instructions.
4 changes: 2 additions & 2 deletions doc/sphinx-guides/source/admin/dataverses-datasets.rst
@@ -22,7 +22,7 @@ Moves a dataverse whose id is passed to a new dataverse whose id is passed. The
Link a Dataverse
^^^^^^^^^^^^^^^^

Creates a link between a dataverse and another dataverse (see the Linked Dataverses + Linked Datasets section of the :doc:`/user/dataverse-management` guide for more information). Only accessible to superusers. ::
Creates a link between a dataverse and another dataverse (see the :ref:`dataverse-linking` section of the User Guide for more information). Only accessible to superusers. ::

curl -H "X-Dataverse-key: $API_TOKEN" -X PUT http://$SERVER/api/dataverses/$linked-dataverse-alias/link/$linking-dataverse-alias

@@ -55,7 +55,7 @@ Moves a dataset whose id is passed to a dataverse whose alias is passed. If the
Link a Dataset
^^^^^^^^^^^^^^

Creates a link between a dataset and a dataverse (see the Linked Dataverses + Linked Datasets section of the :doc:`/user/dataverse-management` guide for more information). ::
Creates a link between a dataset and a dataverse (see the :ref:`dataset-linking` section of the User Guide for more information). ::

curl -H "X-Dataverse-key: $API_TOKEN" -X PUT http://$SERVER/api/datasets/$linked-dataset-id/link/$linking-dataverse-alias

4 changes: 2 additions & 2 deletions doc/sphinx-guides/source/admin/harvestserver.rst
@@ -54,7 +54,7 @@ be used to create an OAI set. Sets can overlap local dataverses, and
can include as few or as many of your local datasets as you wish. A
good way to master the Dataverse search query language is to
experiment with the Advanced Search page. We also recommend that you
consult the Search API section of the Dataverse User Guide.
consult the :doc:`/api/search` section of the API Guide.

Once you have entered the search query and clicked *Next*, the number
of search results found will be shown on the next screen. This way, if
@@ -138,7 +138,7 @@ runs every night (at 2AM, by default). This export timer is created
and activated automatically every time the application is deployed
or restarted. Once again, this is new in Dataverse 4, and unlike DVN
v3, where export jobs had to be scheduled and activated by the admin
user. See the "Export" section of the Admin guide, for more information on the automated metadata exports.
user. See the :doc:`/admin/metadataexport` section of the Admin Guide, for more information on the automated metadata exports.

It is still possible however to make changes like this be immediately
reflected in the OAI server, by going to the *Harvesting Server* page
2 changes: 1 addition & 1 deletion doc/sphinx-guides/source/admin/index.rst
@@ -27,7 +27,7 @@ This guide documents the functionality only available to superusers (such as "da
solr-search-index
ip-groups
monitoring
reporting-tools
reporting-tools-and-queries
maintenance
backups
troubleshooting
4 changes: 2 additions & 2 deletions doc/sphinx-guides/source/admin/make-data-count.rst
@@ -61,9 +61,9 @@ If you haven't already, follow the steps for installing Counter Processor in the
Enable Logging for Make Data Count
~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~

To make Dataverse log dataset usage (views and downloads) for Make Data Count, you must set the ``:MDCLogPath`` database setting. See :ref:`MDCLogPath` for details.
To make Dataverse log dataset usage (views and downloads) for Make Data Count, you must set the ``:MDCLogPath`` database setting. See :ref:`:MDCLogPath` for details.

If you wish to start logging in advance of setting up other components, or wish to log without display MDC metrics for any other reason, you can set the optional ``:DisplayMDCMetrics`` database setting to false. See :ref:`DisplayMDCMetrics` for details.
If you wish to start logging in advance of setting up other components, or wish to log without displaying MDC metrics for any other reason, you can set the optional ``:DisplayMDCMetrics`` database setting to false. See :ref:`:DisplayMDCMetrics` for details.

After you have your first day of logs, you can process them the next day.
