Update from IQSS develop #48

Merged
merged 34 commits
Mar 23, 2020
efa5003
Merge branch 'develop' into 6665-index-dv-api-fails
sekmiller Feb 24, 2020
6663394
#6665 update ds solr docs directly
sekmiller Feb 27, 2020
5ea7a52
Merge branch 'develop' into 6665-index-dv-api-fails
sekmiller Feb 27, 2020
a94b391
#6665 add paths to files on index dataverse
sekmiller Mar 2, 2020
40de0a0
Merge branch 'develop' into 6665-index-dv-api-fails
sekmiller Mar 2, 2020
27370a9
Merge branch 'develop' into 6665-index-dv-api-fails
sekmiller Mar 4, 2020
bce3c33
Issue #6514: Implement affiliation reading from Shibboleth attribute.
pkiraly Mar 6, 2020
bd3f1d6
#6514: extend documentation.
pkiraly Mar 6, 2020
b67125b
adding release note file, some basic doc updates
djbrooke Mar 6, 2020
1085a15
#6665 remove path update from index all
sekmiller Mar 9, 2020
6617e4a
Merge branch 'develop' into 6665-index-dv-api-fails
sekmiller Mar 9, 2020
cf237af
doc edits
djbrooke Mar 10, 2020
c2133b1
add release note
djbrooke Mar 10, 2020
9261dda
Merge branch 'develop' into 6514-optionally-read-affiliation-from-shi…
djbrooke Mar 10, 2020
5a27cdf
changing JVM option to DB option
djbrooke Mar 10, 2020
3af998d
Merge branch 'develop' into 6665-index-dv-api-fails
sekmiller Mar 10, 2020
fdbf8db
Make BundleUtil.getDefaultLocale() respect sane system defaults. #6734
poikilotherm Mar 10, 2020
42f2852
Merge remote-tracking branch 'upstream/develop' into 6734-default-locale
poikilotherm Mar 10, 2020
4f321b8
#6665 removing variable metadata process for benchmarking
sekmiller Mar 10, 2020
022306d
#6736 Add Provenance Example File to Documentation.
pkiraly Mar 11, 2020
1809eb0
#6665 add debug lines for benchmarking with var metadata processing
sekmiller Mar 11, 2020
9cd8817
Merge branch 'develop' into 6665-index-dv-api-fails
sekmiller Mar 11, 2020
7856538
#6665 remove variable metadata indexing and debug statements
sekmiller Mar 12, 2020
c574792
Merge pull request #6704 from IQSS/6665-index-dv-api-fails
kcondon Mar 12, 2020
40eafcf
A *very* simple solution for the undetected database save fail in ing…
landreev Mar 16, 2020
34df93c
One extra dataFileService.save(), to ensure there's no optimistic loc…
landreev Mar 16, 2020
4e70dd6
#6742 update EC2 documentation
donsizemore Mar 16, 2020
d4bfae0
cleaned up logging messages (#6660)
landreev Mar 16, 2020
9466630
Merge branch 'develop' into 6660-restore-originals
landreev Mar 16, 2020
3adb738
Merge pull request #6743 from OdumInstitute/6742_update_EC2_documenta…
kcondon Mar 18, 2020
c1b8211
Merge pull request #6738 from pkiraly/6736-add-provenance-example-fil…
kcondon Mar 18, 2020
d3418bd
Merge pull request #6729 from pkiraly/6514-optionally-read-affiliatio…
kcondon Mar 18, 2020
5650ea5
Merge pull request #6735 from poikilotherm/6734-default-locale
kcondon Mar 18, 2020
04aee63
Merge pull request #6744 from IQSS/6660-restore-originals
kcondon Mar 19, 2020
1 change: 1 addition & 0 deletions doc/release-notes/6514-shib-updates
Original file line number Diff line number Diff line change
@@ -0,0 +1 @@
New DB option :ShibAffiliationAttribute
1 change: 1 addition & 0 deletions doc/sphinx-guides/source/_static/api/file-provenance.json
@@ -0,0 +1 @@
{"prefix": {"pre_0": "http://www.w3.org/2001/XMLSchema", "s-prov": "http://s-prov/ns/#", "provone": "http://purl.dataone.org/provone/2015/01/15/ontology#", "vargen": "http://openprovenance.org/vargen#", "foaf": "http://xmlns.com/foaf/0.1/", "dcterms": "http://purl.org/dc/terms/", "tmpl": "http://openprovenance.org/tmpl#", "var": "http://openprovenance.org/var#", "vcard": "http://www.w3.org/2006/vcard/ns#", "swirrl": "http://project-dare.eu/ns#"}, "bundle": {"vargen:SessionSnapshot": {"prefix": {"s-prov": "http://s-prov/ns/#", "provone": "http://purl.dataone.org/provone/2015/01/15/ontology#", "vargen": "http://openprovenance.org/vargen#", "tmpl": "http://openprovenance.org/tmpl#", "var": "http://openprovenance.org/var#", "vcard": "http://www.w3.org/2006/vcard/ns#", "swirrl": "http://project-dare.eu/ns#"}, "entity": {"vargen:inData": {"swirrl:volumeId": {"$": "var:rawVolumeId", "type": "prov:QUALIFIED_NAME"}, "prov:type": {"$": "provone:Data", "type": "prov:QUALIFIED_NAME"}}, "vargen:inFile": {"prov:atLocation": {"$": "var:atLocation", "type": "prov:QUALIFIED_NAME"}, "s-prov:format": {"$": "var:format", "type": "prov:QUALIFIED_NAME"}, "s-prov:checksum": {"$": "var:checksum", "type": "prov:QUALIFIED_NAME"}}, "vargen:WorkData": {"swirrl:volumeId": {"$": "var:workVolumeId", "type": "prov:QUALIFIED_NAME"}, "prov:type": {"$": "provone:Data", "type": "prov:QUALIFIED_NAME"}}, "var:JupSnapshot": {"prov:generatedAt": {"$": "var:generatedAt", "type": "prov:QUALIFIED_NAME"}, "prov:atLocation": {"$": "var:repoUrl", "type": "prov:QUALIFIED_NAME"}, "s-prov:description": {"$": "var:description", "type": "prov:QUALIFIED_NAME"}, "prov:type": {"$": "swirrl:NotebookSnapshot", "type": "prov:QUALIFIED_NAME"}, "swirrl:sessionId": {"$": "var:sessionId", "type": "prov:QUALIFIED_NAME"}}}, "used": {"_:id1": {"prov:activity": "vargen:snapshot", "prov:entity": "var:Jupyter"}, "_:id2": {"prov:activity": "vargen:snapshot", "prov:entity": "vargen:WorkData"}, "_:id3": {"prov:activity": 
"vargen:snapshot", "prov:entity": "vargen:inData"}}, "wasDerivedFrom": {"_:id4": {"prov:usedEntity": "var:Jupyter", "prov:generatedEntity": "var:JupSnapshot"}}, "wasAssociatedWith": {"_:id5": {"prov:activity": "vargen:snapshot", "prov:agent": "var:snapAgent"}}, "actedOnBehalfOf": {"_:id6": {"prov:delegate": "var:snapAgent", "prov:responsible": "var:user"}}, "activity": {"vargen:snapshot": {"prov:atLocation": {"$": "var:method_path", "type": "prov:QUALIFIED_NAME"}, "tmpl:startTime": {"$": "var:startTime", "type": "prov:QUALIFIED_NAME"}, "tmpl:endTime": {"$": "var:endTime", "type": "prov:QUALIFIED_NAME"}}}, "wasGeneratedBy": {"_:id7": {"prov:activity": "vargen:snapshot", "prov:entity": "var:JupSnapshot"}}, "agent": {"var:user": {"vcard:uid": {"$": "var:name", "type": "prov:QUALIFIED_NAME"}, "swirrl:authMode": {"$": "var:authmode", "type": "prov:QUALIFIED_NAME"}, "swirrl:group": {"$": "var:group", "type": "prov:QUALIFIED_NAME"}, "prov:type": {"$": "prov:Person", "type": "prov:QUALIFIED_NAME"}}, "var:snapAgent": {"vcard:uid": {"$": "var:name_api", "type": "prov:QUALIFIED_NAME"}, "prov:type": {"$": "prov:SoftwareAgent", "type": "prov:QUALIFIED_NAME"}}}, "hadMember": {"_:id8": {"prov:collection": "vargen:inData", "prov:entity": "vargen:inFile"}}}}}
2 changes: 2 additions & 0 deletions doc/sphinx-guides/source/api/native-api.rst
@@ -2190,6 +2190,8 @@ The fully expanded example above (without environment variables) looks like this

curl -H "X-Dataverse-key:xxxxxxxx-xxxx-xxxx-xxxx-xxxxxxxxxxxx" -X POST "https://demo.dataverse.org/api/files/:persistentId/prov-freeform?persistentId=doi:10.5072/FK2/AAA000" -H "Content-type:application/json" --upload-file provenance.json

See a sample JSON file :download:`file-provenance.json <../_static/api/file-provenance.json>` from http://openprovenance.org (c.f. Huynh, Trung Dong and Moreau, Luc (2014) ProvStore: a public provenance repository. At 5th International Provenance and Annotation Workshop (IPAW'14), Cologne, Germany, 09-13 Jun 2014. pp. 275-277).

Delete Provenance JSON for an uploaded file
^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^

15 changes: 9 additions & 6 deletions doc/sphinx-guides/source/developers/deployment.rst
@@ -82,23 +82,26 @@ Download and Run the "Create Instance" Script

Once you have done the configuration above, you are ready to try running the "ec2-create-instance.sh" script to spin up Dataverse in AWS.

Download :download:`ec2-create-instance.sh <../../../../scripts/installer/ec2-create-instance.sh>` and put it somewhere reasonable. For the purpose of these instructions we'll assume it's in the "Downloads" directory in your home directory.
Download `ec2-create-instance.sh <https://raw.githubusercontent.com/IQSS/dataverse-ansible/master/ec2/ec2-create-instance.sh>`_ and put it somewhere reasonable. For the purpose of these instructions we'll assume it's in the "Downloads" directory in your home directory.

ec2-create-instance accepts a number few command-line switches:
To run it with default values you need only the script, but you may also want a current copy of the Ansible `group vars <https://raw.githubusercontent.com/IQSS/dataverse-ansible/master/defaults/main.yml>`_ file.

ec2-create-instance accepts a number of command-line switches, including:

* -r: GitHub Repository URL (defaults to https://github.com/IQSS/dataverse.git)
* -b: branch to build (defaults to develop)
* -p: pemfile directory (defaults to $HOME)
* -g: Ansible GroupVars file (if you wish to override role defaults)
* -h: help (displays usage for each available option)

``bash ~/Downloads/ec2-create-instance.sh -b develop -r https://github.com/scholarsportal/dataverse.git -g main.yml``

Now you will need to wait around 15 minutes until the deployment is finished. Eventually, the output should tell you how to access the installation of Dataverse in a web browser or via ssh. It will also provide instructions on how to delete the instance when you are finished with it. Please be aware that AWS charges per minute for a running instance. You can also delete your instance from https://console.aws.amazon.com/console/home?region=us-east-1 .
You will need to wait for 15 minutes or so until the deployment is finished, longer if you've enabled sample data and/or the API test suite. Eventually, the output should tell you how to access the installation of Dataverse in a web browser or via SSH. It will also provide instructions on how to delete the instance when you are finished with it. Please be aware that AWS charges per minute for a running instance. You may also delete your instance from https://console.aws.amazon.com/console/home?region=us-east-1 .

Caveats
~~~~~~~
Caveat Recipiens
~~~~~~~~~~~~~~~~

Please note that while the script should work fine on newish branches, older branches that have different dependencies such as an older version of Solr may not produce a working Dataverse installation. Your mileage may vary.
Please note that while the script should work well on new-ish branches, older branches that have different dependencies such as an older version of Solr may not produce a working Dataverse installation. Your mileage may vary.

----

29 changes: 20 additions & 9 deletions doc/sphinx-guides/source/developers/testing.rst
@@ -108,22 +108,33 @@ Unfortunately, the term "integration tests" can mean different things to differe
Running the Full API Test Suite Using EC2
~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~

To run the API test suite on EC2 you should first follow the steps in the :doc:`deployment` section to get set up for AWS in general and EC2 in particular.
To run the API test suite in an EC2 instance you should first follow the steps in the :doc:`deployment` section to get set up for AWS in general and EC2 in particular.

Then read the instructions in https://github.com/IQSS/dataverse-sample-data for EC2 but be sure to make the adjustments below.
You may always retrieve a current copy of the ec2-create-instance.sh script and the accompanying group vars file from the `dataverse-ansible repo <https://github.com/IQSS/dataverse-ansible/>`_:

Edit ``ec2config.yaml`` to change ``test_suite`` to ``true``.
- `ec2-create-instance.sh <https://raw.githubusercontent.com/IQSS/dataverse-ansible/master/ec2/ec2-create-instance.sh>`_
- `main.yml <https://raw.githubusercontent.com/IQSS/dataverse-ansible/master/defaults/main.yml>`_

Pass in the repo and branch you are testing. You should also specify a local directory where server.log and other useful information will be written so you can start debugging any failures.
Edit ``main.yml`` to set the desired GitHub repo, branch, and to ensure that the API test suite is enabled:

- ``dataverse_repo: https://github.com/IQSS/dataverse.git``
- ``dataverse_branch: develop``
- ``dataverse.api.test_suite: true``
- ``dataverse.sampledata.enabled: true``
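The edits above can be sketched as a short shell session. This is a hedged example: the flat key layout below is illustrative only, and the real main.yml may nest these keys differently, so adjust the patterns to match your copy of the file.

```shell
# Illustrative only: a simplified main.yml with keys like those named above.
# The real defaults file may nest these keys differently.
cat > main.yml <<'EOF'
dataverse_repo: https://github.com/IQSS/dataverse.git
dataverse_branch: develop
test_suite: false
sampledata_enabled: false
EOF

# Enable the API test suite and sample data in place
# (-i.bak keeps a backup and works on both GNU and BSD sed).
sed -i.bak -e 's/^test_suite: .*/test_suite: true/' \
           -e 's/^sampledata_enabled: .*/sampledata_enabled: true/' main.yml

# Show the flags we just changed.
grep -E '^(test_suite|sampledata_enabled):' main.yml
```

The edited file is then passed to the script with ``-g main.yml``.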

If you wish, you may pass the local path of a logging directory, which will tell ec2-create-instance.sh to `grab Glassfish, Maven, and other logs <https://github.com/IQSS/dataverse-ansible/blob/master/ec2/ec2-create-instance.sh#L185>`_ for your review.

Finally, run the script:

.. code-block:: bash

export REPO=https://github.com/IQSS/dataverse.git
export BRANCH=123-my-branch
export LOGS=/tmp/123
$ ./ec2-create-instance.sh -g main.yml -l log_dir

Near the beginning and at the end of the ec2-create-instance.sh output you will see instructions for connecting to the instance via SSH. If you are actively working on a branch and want to refresh the warfile after each commit, you may wish to call the `redeploy.sh <https://github.com/IQSS/dataverse-ansible/blob/master/templates/redeploy.sh.j2>`_ script placed by the Ansible role, which will do a "git pull" against your branch, build the warfile, deploy it, then restart Glassfish. By default this script is written to /tmp/dataverse/redeploy.sh. You may invoke it by appending it to the SSH command in ec2-create-instance.sh's output:

.. code-block:: bash

mkdir $LOGS
./ec2-create-instance.sh -g ec2config.yaml -r $REPO -b $BRANCH -l $LOGS
$ ssh -i your_pem.pem [email protected] /tmp/dataverse/redeploy.sh

Running the full API test suite using Docker
~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~
47 changes: 47 additions & 0 deletions doc/sphinx-guides/source/installation/config.rst
@@ -1777,6 +1777,53 @@ You can set the value of "#THIS PAGE#" to the URL of your Dataverse homepage, or

``curl -X PUT -d true http://localhost:8080/api/admin/settings/:ShibPassiveLoginEnabled``

:ShibAffiliationAttribute
+++++++++++++++++++++++++

The Shibboleth affiliation attribute holds information about the user's affiliation (e.g. "OU") and is read from the DiscoFeed at each login. ``:ShibAffiliationAttribute`` specifies the name of a Shibboleth attribute in the Shibboleth header that Dataverse will read from instead of the DiscoFeed. If this value is not set or is empty, Dataverse uses the DiscoFeed.

If the attribute is not yet configured in Shibboleth, please consult the Shibboleth administrators at your institution. Typically this requires changing the ``/etc/shibboleth/attribute-map.xml`` file by adding an attribute request, e.g.

```
<Attribute name="urn:oid:2.5.4.11" id="ou">
<AttributeDecoder xsi:type="StringAttributeDecoder" caseSensitive="false"/>
</Attribute>
```

To apply the change, restart the Shibboleth and Apache2 services:

```
sudo service shibd restart
sudo service apache2 restart
```

To check whether the attribute is being sent, log in to Dataverse again and check Shibboleth's transaction log. You should see something like this:

```
INFO Shibboleth-TRANSACTION [25]: Cached the following attributes with session (ID: _9d1f34c0733b61c0feb0ca7596ef43b2) for (applicationId: default) {
INFO Shibboleth-TRANSACTION [25]: givenName (1 values)
INFO Shibboleth-TRANSACTION [25]: ou (1 values)
INFO Shibboleth-TRANSACTION [25]: sn (1 values)
INFO Shibboleth-TRANSACTION [25]: eppn (1 values)
INFO Shibboleth-TRANSACTION [25]: mail (1 values)
INFO Shibboleth-TRANSACTION [25]: displayName (1 values)
INFO Shibboleth-TRANSACTION [25]: }
```

If you see the attribute you requested in this list, you can set the attribute in Dataverse.
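If you only want the attribute names from a log excerpt like the one above, a small filter can help. This is a hedged sketch: the log line format is taken from the excerpt above, and the ``extract_attrs`` helper name and the inlined sample are illustrative — in practice you would pipe in ``/var/log/shibboleth/transaction.log`` (its location may vary by distribution).

```shell
# Hedged sketch: list attribute names from a Shibboleth transaction-log excerpt.
# The helper name and the inlined sample are illustrative; in practice, feed in
# /var/log/shibboleth/transaction.log (path varies by distribution).
extract_attrs() {
  # Match lines of the form "... Shibboleth-TRANSACTION [NN]: name (N values)"
  # and print only the attribute name.
  sed -n 's/.*Shibboleth-TRANSACTION \[[0-9]*\]: *\([a-zA-Z]*\) ([0-9]* values).*/\1/p'
}

# Prints one attribute name per line for this sample.
extract_attrs <<'EOF'
INFO Shibboleth-TRANSACTION [25]: givenName (1 values)
INFO Shibboleth-TRANSACTION [25]: ou (1 values)
INFO Shibboleth-TRANSACTION [25]: mail (1 values)
EOF
```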

To set ``:ShibAffiliationAttribute``:

``curl -X PUT -d "ou" http://localhost:8080/api/admin/settings/:ShibAffiliationAttribute``

To delete ``:ShibAffiliationAttribute``:

``curl -X DELETE http://localhost:8080/api/admin/settings/:ShibAffiliationAttribute``

To check the current value of ``:ShibAffiliationAttribute``:

``curl -X GET http://localhost:8080/api/admin/settings/:ShibAffiliationAttribute``

.. _:ComputeBaseUrl:

:ComputeBaseUrl
107 changes: 82 additions & 25 deletions scripts/installer/ec2-create-instance.sh
@@ -3,21 +3,28 @@
# For docs, see the "Deployment" page in the Dev Guide.

# repo and branch defaults
REPO_URL='https://github.com/IQSS/dataverse.git'
BRANCH='develop'
REPO_URL_DEFAULT='https://github.com/IQSS/dataverse.git'
BRANCH_DEFAULT='develop'
PEM_DEFAULT=${HOME}
AWS_AMI_DEFAULT='ami-9887c6e7'

usage() {
echo "Usage: $0 -b <branch> -r <repo> -p <pem_dir> -g <group_vars>" 1>&2
echo "Usage: $0 -b <branch> -r <repo> -p <pem_dir> -g <group_vars> -a <dataverse-ansible branch> -i aws_image -s aws_size -t aws_tag -l local_log_path" 1>&2
echo "default branch is develop"
echo "default repo is https://github.com/IQSS/dataverse"
echo "default .pem location is ${HOME}"
echo "example group_vars may be retrieved from https://raw.githubusercontent.com/IQSS/dataverse-ansible/master/defaults/main.yml"
echo "default AWS AMI ID is $AWS_AMI_DEFAULT"
echo "default AWS size is t2.medium"
echo "local log path"
exit 1
}

while getopts ":r:b:g:p:" o; do
while getopts ":a:r:b:g:p:i:s:t:l:" o; do
case "${o}" in
a)
DA_BRANCH=${OPTARG}
;;
r)
REPO_URL=${OPTARG}
;;
@@ -30,32 +37,74 @@ while getopts ":r:b:g:p:" o; do
p)
PEM_DIR=${OPTARG}
;;
i)
AWS_IMAGE=${OPTARG}
;;
s)
AWS_SIZE=${OPTARG}
;;
t)
TAG=${OPTARG}
;;
l)
LOCAL_LOG_PATH=${OPTARG}
;;
*)
usage
;;
esac
done

# test for user-supplied conf files
# test for ansible group_vars
if [ ! -z "$GRPVRS" ]; then
GVFILE=$(basename "$GRPVRS")
GVARG="-e @$GVFILE"
echo "using $GRPVRS for extra vars"
fi

# test for CLI args
if [ ! -z "$REPO_URL" ]; then
GVARG+=" -e dataverse_repo=$REPO_URL"
echo "using $REPO_URL"
echo "using repo $REPO_URL"
fi

if [ ! -z "$BRANCH" ]; then
GVARG+=" -e dataverse_branch=$BRANCH"
echo "building $BRANCH"
echo "building branch $BRANCH"
fi

# The AMI ID may change in the future and the way to look it up is with the following command, which takes a long time to run:
# aws ec2 describe-images --owners 'aws-marketplace' --filters 'Name=product-code,Values=aw0evgkw8e5c1q413zgy5pjce' --query 'sort_by(Images, &CreationDate)[-1].[ImageId]' --output 'text'
# To use an AMI, one must subscribe to it via the AWS GUI.
# AMI IDs are specific to the region.

if [ ! -z "$AWS_IMAGE" ]; then
AMI_ID=$AWS_IMAGE
else
AMI_ID="$AWS_AMI_DEFAULT"
fi
echo "using $AMI_ID"

if [ ! -z "$AWS_SIZE" ]; then
SIZE=$AWS_SIZE
else
SIZE="t2.medium"
fi
echo "using $SIZE"

if [ ! -z "$TAG" ]; then
TAGARG="--tag-specifications ResourceType=instance,Tags=[{Key=name,Value=$TAG}]"
echo "using tag $TAG"
fi

# default to dataverse-ansible/master
if [ -z "$DA_BRANCH" ]; then
DA_BRANCH="master"
fi

# ansible doesn't care about pem_dir (yet)
if [ -z "$PEM_DIR" ]; then
PEM_DIR="$PEM_DEFAULT"
echo "using $PEM_DIR"
fi

AWS_CLI_VERSION=$(aws --version)
@@ -95,22 +144,12 @@ else
exit 1
fi

# The AMI ID may change in the future and the way to look it up is with the
# following command, which takes a long time to run:
#
# aws ec2 describe-images --owners 'aws-marketplace' --filters 'Name=product-code,Values=aw0evgkw8e5c1q413zgy5pjce' --query 'sort_by(Images, &CreationDate)[-1].[ImageId]' --output 'text'
#
# To use this AMI, we subscribed to it from the AWS GUI.
# AMI IDs are specific to the region.
AMI_ID='ami-9887c6e7'
# Smaller than medium lead to Maven and Solr problems.
SIZE='t2.medium'
echo "Creating EC2 instance"
# TODO: Add some error checking for "ec2 run-instances".
INSTANCE_ID=$(aws ec2 run-instances --image-id $AMI_ID --security-groups $SECURITY_GROUP --count 1 --instance-type $SIZE --key-name $PEM_DIR/$KEY_NAME --query 'Instances[0].InstanceId' --block-device-mappings '[ { "DeviceName": "/dev/sda1", "Ebs": { "DeleteOnTermination": true } } ]' | tr -d \")
INSTANCE_ID=$(aws ec2 run-instances --image-id $AMI_ID --security-groups $SECURITY_GROUP $TAGARG --count 1 --instance-type $SIZE --key-name $PEM_DIR/$KEY_NAME --query 'Instances[0].InstanceId' --block-device-mappings '[ { "DeviceName": "/dev/sda1", "Ebs": { "DeleteOnTermination": true } } ]' | tr -d \")
echo "Instance ID: "$INSTANCE_ID
echo "giving instance 30 seconds to wake up..."
sleep 30
echo "giving instance 60 seconds to wake up..."
sleep 60
echo "End creating EC2 instance"

PUBLIC_DNS=$(aws ec2 describe-instances --instance-ids $INSTANCE_ID --query "Reservations[*].Instances[*].[PublicDnsName]" --output text)
@@ -132,18 +171,36 @@ ssh -T -i $PEM_FILE -o 'StrictHostKeyChecking no' -o 'UserKnownHostsFile=/dev/nu
sudo yum -y install epel-release
sudo yum -y install https://releases.ansible.com/ansible/rpm/release/epel-7-x86_64/ansible-2.7.9-1.el7.ans.noarch.rpm
sudo yum -y install git nano
git clone https://github.com/IQSS/dataverse-ansible.git dataverse
git clone -b $DA_BRANCH https://github.com/IQSS/dataverse-ansible.git dataverse
export ANSIBLE_ROLES_PATH=.
echo $extra_vars
ansible-playbook -v -i dataverse/inventory dataverse/dataverse.pb --connection=local $GVARG
EOF

if [ ! -z "$LOCAL_LOG_PATH" ]; then
echo "copying logs to $LOCAL_LOG_PATH."
# 1 accept SSH keys
ssh-keyscan ${PUBLIC_DNS} >> ~/.ssh/known_hosts
# 2 logdir should exist
mkdir -p $LOCAL_LOG_PATH
# 3 grab logs for local processing in jenkins
rsync -av -e "ssh -i $PEM_FILE" --ignore-missing-args centos@$PUBLIC_DNS:/tmp/dataverse/target/site $LOCAL_LOG_PATH/
rsync -av -e "ssh -i $PEM_FILE" --ignore-missing-args centos@$PUBLIC_DNS:/tmp/dataverse/target/surefire-reports $LOCAL_LOG_PATH/
rsync -av -e "ssh -i $PEM_FILE" centos@$PUBLIC_DNS:/usr/local/glassfish4/glassfish/domains/domain1/logs/server* $LOCAL_LOG_PATH/
# 4 grab mvn.out
rsync -av -e "ssh -i $PEM_FILE" --ignore-missing-args centos@$PUBLIC_DNS:/tmp/dataverse/mvn.out $LOCAL_LOG_PATH/
# 5 jacoco
rsync -av -e "ssh -i $PEM_FILE" --ignore-missing-args centos@$PUBLIC_DNS:/tmp/dataverse/target/coverage-it $LOCAL_LOG_PATH/
rsync -av -e "ssh -i $PEM_FILE" --ignore-missing-args centos@$PUBLIC_DNS:/tmp/dataverse/target/*.exec $LOCAL_LOG_PATH/
rsync -av -e "ssh -i $PEM_FILE" --ignore-missing-args centos@$PUBLIC_DNS:/tmp/dataverse/target/classes $LOCAL_LOG_PATH/
rsync -av -e "ssh -i $PEM_FILE" --ignore-missing-args centos@$PUBLIC_DNS:/tmp/dataverse/src $LOCAL_LOG_PATH/
fi

# Port 8080 has been added because Ansible puts a redirect in place
# from HTTP to HTTPS and the cert is invalid (self-signed), forcing
# the user to click through browser warnings.
CLICKABLE_LINK="http://${PUBLIC_DNS}:8080"
CLICKABLE_LINK="http://${PUBLIC_DNS}"
echo "To ssh into the new instance:"
echo "ssh -i $PEM_FILE $USER_AT_HOST"
echo "Branch \"$BRANCH\" from $REPO_URL has been deployed to $CLICKABLE_LINK"
echo "Branch $BRANCH from $REPO_URL has been deployed to $CLICKABLE_LINK"
echo "When you are done, please terminate your instance with:"
echo "aws ec2 terminate-instances --instance-ids $INSTANCE_ID"