Commit

Merge branch 'develop' of github.com:IQSS/dataverse into 10943-featured-items
GPortas committed Jan 10, 2025
2 parents 3df4723 + 6d6a509 commit b2c918a
Showing 13 changed files with 151 additions and 50 deletions.
4 changes: 4 additions & 0 deletions doc/release-notes/10171-exlude-metadatablocks.md
@@ -0,0 +1,4 @@
The API endpoints `{id}/versions` and `{id}/versions/{versionId}` have been extended with an optional ``excludeMetadataBlocks`` parameter
that specifies whether the metadata blocks should be listed in the output. It defaults to ``false``, preserving backward
compatibility. (Note that for a dataset with a large number of versions and/or metadata blocks, including the metadata
blocks can dramatically increase the volume of the output.) See also [the guides](https://dataverse-guide--10778.org.readthedocs.build/en/10778/api/native-api.html#list-versions-of-a-dataset), #10778, and #10171.
@@ -0,0 +1,6 @@
## Improvement and internationalization of harvest status

Added a new harvest status to differentiate a harvest that completed with errors ("Completed with failures") from one that completed without errors ("Completed").
Harvest status labels are now internationalized.

For more information, see issue [#9294](https://github.com/IQSS/dataverse/issues/9294)
8 changes: 8 additions & 0 deletions doc/sphinx-guides/source/api/native-api.rst
@@ -1295,6 +1295,8 @@ It returns a list of versions with their metadata, and file list:
The optional ``excludeFiles`` parameter specifies whether the files should be listed in the output. It defaults to ``true``, preserving backward compatibility. (Note that for a dataset with a large number of versions and/or files, including the files can dramatically increase the volume of the output.) A separate ``/files`` API can be used for listing the files, or a subset thereof, in a given version.

The optional ``excludeMetadataBlocks`` parameter specifies whether the metadata blocks should be listed in the output. It defaults to ``false``, preserving backward compatibility. (Note that for a dataset with a large number of versions and/or metadata blocks, including the metadata blocks can dramatically increase the volume of the output.)

The optional ``offset`` and ``limit`` parameters can be used to specify the range of the versions list to be shown. This can be used to paginate through the list in a dataset with a large number of versions.


@@ -1319,6 +1321,12 @@ The fully expanded example above (without environment variables) looks like this
The optional ``excludeFiles`` parameter specifies whether the files should be listed in the output (defaults to ``true``). Note that a separate ``/files`` API can be used for listing the files, or a subset thereof in a given version.

The optional ``excludeMetadataBlocks`` parameter specifies whether the metadata blocks should be listed in the output (defaults to ``false``).

.. code-block:: bash

  curl "https://demo.dataverse.org/api/datasets/24/versions/1.0?excludeMetadataBlocks=false"


By default, deaccessioned dataset versions are not included in the search when applying the :latest or :latest-published identifiers. Additionally, when filtering by a specific version tag, you will get a "not found" error if the version is deaccessioned and you do not enable the ``includeDeaccessioned`` option described below.

11 changes: 7 additions & 4 deletions src/main/java/edu/harvard/iq/dataverse/api/Datasets.java
@@ -421,15 +421,16 @@ public Response useDefaultCitationDate(@Context ContainerRequestContext crc, @Pa
@GET
@AuthRequired
@Path("{id}/versions")
public Response listVersions(@Context ContainerRequestContext crc, @PathParam("id") String id, @QueryParam("excludeFiles") Boolean excludeFiles, @QueryParam("limit") Integer limit, @QueryParam("offset") Integer offset) {
public Response listVersions(@Context ContainerRequestContext crc, @PathParam("id") String id, @QueryParam("excludeFiles") Boolean excludeFiles, @QueryParam("excludeMetadataBlocks") Boolean excludeMetadataBlocks, @QueryParam("limit") Integer limit, @QueryParam("offset") Integer offset) {

return response( req -> {
Dataset dataset = findDatasetOrDie(id);
Boolean deepLookup = excludeFiles == null ? true : !excludeFiles;
Boolean includeMetadataBlocks = excludeMetadataBlocks == null ? true : !excludeMetadataBlocks;

return ok( execCommand( new ListVersionsCommand(req, dataset, offset, limit, deepLookup) )
.stream()
.map( d -> json(d, deepLookup) )
.map( d -> json(d, deepLookup, includeMetadataBlocks) )
.collect(toJsonArray()));
}, getRequestUser(crc));
}
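The boxed ``Boolean`` parameters above implement a backward-compatible default: an absent query parameter (``null``) resolves to "include". A minimal standalone sketch of that resolution (the class and method names are illustrative, not part of Dataverse):

```java
// Illustrative sketch of the backward-compatible default resolution used in
// listVersions/getVersion: a missing (null) exclude* query parameter means
// the corresponding data IS included in the output.
public class ExcludeFlagDefaults {

    // The parameter arrives as a boxed Boolean so that "not supplied" (null)
    // is distinguishable from an explicit false.
    public static boolean include(Boolean excludeParam) {
        return excludeParam == null ? true : !excludeParam;
    }

    public static void main(String[] args) {
        System.out.println(include(null));  // parameter omitted -> include
        System.out.println(include(false)); // explicit false -> include
        System.out.println(include(true));  // explicit true -> exclude
    }
}
```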
@@ -441,6 +442,7 @@ public Response getVersion(@Context ContainerRequestContext crc,
@PathParam("id") String datasetId,
@PathParam("versionId") String versionId,
@QueryParam("excludeFiles") Boolean excludeFiles,
@QueryParam("excludeMetadataBlocks") Boolean excludeMetadataBlocks,
@QueryParam("includeDeaccessioned") boolean includeDeaccessioned,
@QueryParam("returnOwners") boolean returnOwners,
@Context UriInfo uriInfo,
@@ -466,11 +468,12 @@ public Response getVersion(@Context ContainerRequestContext crc,
if (excludeFiles == null ? true : !excludeFiles) {
requestedDatasetVersion = datasetversionService.findDeep(requestedDatasetVersion.getId());
}
Boolean includeMetadataBlocks = excludeMetadataBlocks == null ? true : !excludeMetadataBlocks;

JsonObjectBuilder jsonBuilder = json(requestedDatasetVersion,
null,
excludeFiles == null ? true : !excludeFiles,
returnOwners);
excludeFiles == null ? true : !excludeFiles,
returnOwners, includeMetadataBlocks);
return ok(jsonBuilder);

}, getRequestUser(crc));
@@ -6,7 +6,10 @@
package edu.harvard.iq.dataverse.harvest.client;

import java.io.Serializable;
import java.util.Arrays;
import java.util.Date;

import edu.harvard.iq.dataverse.util.BundleUtil;
import jakarta.persistence.Entity;
import jakarta.persistence.GeneratedValue;
import jakarta.persistence.GenerationType;
@@ -40,13 +43,7 @@ public void setId(Long id) {
this.id = id;
}

public enum RunResultType { SUCCESS, FAILURE, INPROGRESS, INTERRUPTED };

private static String RESULT_LABEL_SUCCESS = "SUCCESS";
private static String RESULT_LABEL_FAILURE = "FAILED";
private static String RESULT_LABEL_INPROGRESS = "IN PROGRESS";
private static String RESULT_DELETE_IN_PROGRESS = "DELETE IN PROGRESS";
private static String RESULT_LABEL_INTERRUPTED = "INTERRUPTED";
public enum RunResultType { COMPLETED, COMPLETED_WITH_FAILURES, FAILURE, IN_PROGRESS, INTERRUPTED }

@ManyToOne
@JoinColumn(nullable = false)
@@ -68,36 +65,43 @@ public RunResultType getResult() {

public String getResultLabel() {
if (harvestingClient != null && harvestingClient.isDeleteInProgress()) {
return RESULT_DELETE_IN_PROGRESS;
return BundleUtil.getStringFromBundle("harvestclients.result.deleteInProgress");
}

if (isSuccess()) {
return RESULT_LABEL_SUCCESS;

if (isCompleted()) {
return BundleUtil.getStringFromBundle("harvestclients.result.completed");
} else if (isCompletedWithFailures()) {
return BundleUtil.getStringFromBundle("harvestclients.result.completedWithFailures");
} else if (isFailed()) {
return RESULT_LABEL_FAILURE;
return BundleUtil.getStringFromBundle("harvestclients.result.failure");
} else if (isInProgress()) {
return RESULT_LABEL_INPROGRESS;
return BundleUtil.getStringFromBundle("harvestclients.result.inProgess");
} else if (isInterrupted()) {
return RESULT_LABEL_INTERRUPTED;
return BundleUtil.getStringFromBundle("harvestclients.result.interrupted");
}
return null;
}

public String getDetailedResultLabel() {
if (harvestingClient != null && harvestingClient.isDeleteInProgress()) {
return RESULT_DELETE_IN_PROGRESS;
return BundleUtil.getStringFromBundle("harvestclients.result.deleteInProgress");
}
if (isSuccess() || isInterrupted()) {
if (isCompleted() || isCompletedWithFailures() || isInterrupted()) {
String resultLabel = getResultLabel();

resultLabel = resultLabel.concat("; "+harvestedDatasetCount+" harvested, ");
resultLabel = resultLabel.concat(deletedDatasetCount+" deleted, ");
resultLabel = resultLabel.concat(failedDatasetCount+" failed.");

String details = BundleUtil.getStringFromBundle("harvestclients.result.details", Arrays.asList(
harvestedDatasetCount.toString(),
deletedDatasetCount.toString(),
failedDatasetCount.toString()
));
if(details != null) {
resultLabel = resultLabel + "; " + details;
}
return resultLabel;
} else if (isFailed()) {
return RESULT_LABEL_FAILURE;
return BundleUtil.getStringFromBundle("harvestclients.result.failure");
} else if (isInProgress()) {
return RESULT_LABEL_INPROGRESS;
return BundleUtil.getStringFromBundle("harvestclients.result.inProgess");
}
return null;
}
@@ -106,12 +110,20 @@ public void setResult(RunResultType harvestResult) {
this.harvestResult = harvestResult;
}

public boolean isSuccess() {
return RunResultType.SUCCESS == harvestResult;
public boolean isCompleted() {
return RunResultType.COMPLETED == harvestResult;
}

public void setCompleted() {
harvestResult = RunResultType.COMPLETED;
}

public boolean isCompletedWithFailures() {
return RunResultType.COMPLETED_WITH_FAILURES == harvestResult;
}

public void setSuccess() {
harvestResult = RunResultType.SUCCESS;
public void setCompletedWithFailures() {
harvestResult = RunResultType.COMPLETED_WITH_FAILURES;
}

public boolean isFailed() {
@@ -123,12 +135,12 @@ public void setFailed() {
}

public boolean isInProgress() {
return RunResultType.INPROGRESS == harvestResult ||
return RunResultType.IN_PROGRESS == harvestResult ||
(harvestResult == null && startTime != null && finishTime == null);
}

public void setInProgress() {
harvestResult = RunResultType.INPROGRESS;
harvestResult = RunResultType.IN_PROGRESS;
}

public boolean isInterrupted() {
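The renamed enum values and the null-result fallback in ``isInProgress()`` can be exercised in isolation. A standalone sketch (the ``HarvestRunStatus`` harness class is illustrative; the enum values and the fallback rule follow the diff):

```java
import java.util.Date;

// Sketch of the run-status logic from ClientHarvestRun: a run with no
// recorded result but a start time and no finish time is treated as
// still in progress.
public class HarvestRunStatus {

    public enum RunResultType { COMPLETED, COMPLETED_WITH_FAILURES, FAILURE, IN_PROGRESS, INTERRUPTED }

    public static boolean isInProgress(RunResultType result, Date startTime, Date finishTime) {
        return result == RunResultType.IN_PROGRESS
                || (result == null && startTime != null && finishTime == null);
    }
}
```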
@@ -163,7 +163,7 @@ public void doHarvest(DataverseRequest dataverseRequest, Long harvestingClientId

try {
if (harvestingClientConfig.isHarvestingNow()) {
hdLogger.log(Level.SEVERE, "Cannot start harvest, client " + harvestingClientConfig.getName() + " is already harvesting.");
hdLogger.log(Level.SEVERE, String.format("Cannot start harvest, client %s is already harvesting.", harvestingClientConfig.getName()));

} else {
harvestingClientService.resetHarvestInProgress(harvestingClientId);
@@ -176,9 +176,16 @@ } else {
} else {
throw new IOException("Unsupported harvest type");
}
harvestingClientService.setHarvestSuccess(harvestingClientId, new Date(), harvestedDatasetIds.size(), failedIdentifiers.size(), deletedIdentifiers.size());
hdLogger.log(Level.INFO, "COMPLETED HARVEST, server=" + harvestingClientConfig.getArchiveUrl() + ", metadataPrefix=" + harvestingClientConfig.getMetadataPrefix());
hdLogger.log(Level.INFO, "Datasets created/updated: " + harvestedDatasetIds.size() + ", datasets deleted: " + deletedIdentifiers.size() + ", datasets failed: " + failedIdentifiers.size());

if (failedIdentifiers.isEmpty()) {
harvestingClientService.setHarvestCompleted(harvestingClientId, new Date(), harvestedDatasetIds.size(), failedIdentifiers.size(), deletedIdentifiers.size());
hdLogger.log(Level.INFO, String.format("COMPLETED HARVEST, server=%s, metadataPrefix=%s", harvestingClientConfig.getArchiveUrl(), harvestingClientConfig.getMetadataPrefix()));
} else {
harvestingClientService.setHarvestCompletedWithFailures(harvestingClientId, new Date(), harvestedDatasetIds.size(), failedIdentifiers.size(), deletedIdentifiers.size());
hdLogger.log(Level.INFO, String.format("COMPLETED HARVEST WITH FAILURES, server=%s, metadataPrefix=%s", harvestingClientConfig.getArchiveUrl(), harvestingClientConfig.getMetadataPrefix()));
}

hdLogger.log(Level.INFO, String.format("Datasets created/updated: %s, datasets deleted: %s, datasets failed: %s", harvestedDatasetIds.size(), deletedIdentifiers.size(), failedIdentifiers.size()));

}
} catch (StopHarvestException she) {
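The branch added above chooses the final run status from the failure count alone. Distilled into a standalone method (the ``HarvestOutcome`` class and method name are illustrative):

```java
import java.util.List;

// Distillation of the status choice made after a harvest run completes:
// any failed identifiers downgrade the result from COMPLETED to
// COMPLETED_WITH_FAILURES.
public class HarvestOutcome {

    public enum RunResultType { COMPLETED, COMPLETED_WITH_FAILURES }

    public static RunResultType resultFor(List<String> failedIdentifiers) {
        return failedIdentifiers.isEmpty()
                ? RunResultType.COMPLETED
                : RunResultType.COMPLETED_WITH_FAILURES;
    }
}
```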
@@ -297,7 +297,7 @@ public ClientHarvestRun getLastSuccessfulRun() {
int i = harvestHistory.size() - 1;

while (i > -1) {
if (harvestHistory.get(i).isSuccess()) {
if (harvestHistory.get(i).isCompleted() || harvestHistory.get(i).isCompletedWithFailures()) {
return harvestHistory.get(i);
}
i--;
@@ -314,7 +314,7 @@ ClientHarvestRun getLastNonEmptyRun() {
int i = harvestHistory.size() - 1;

while (i > -1) {
if (harvestHistory.get(i).isSuccess()) {
if (harvestHistory.get(i).isCompleted() || harvestHistory.get(i).isCompletedWithFailures()) {
if (harvestHistory.get(i).getHarvestedDatasetCount().longValue() > 0 ||
harvestHistory.get(i).getDeletedDatasetCount().longValue() > 0) {
return harvestHistory.get(i);
@@ -164,8 +164,13 @@ public void deleteClient(Long clientId) {
}

@TransactionAttribute(TransactionAttributeType.REQUIRES_NEW)
public void setHarvestSuccess(Long hcId, Date currentTime, int harvestedCount, int failedCount, int deletedCount) {
recordHarvestJobStatus(hcId, currentTime, harvestedCount, failedCount, deletedCount, ClientHarvestRun.RunResultType.SUCCESS);
public void setHarvestCompleted(Long hcId, Date currentTime, int harvestedCount, int failedCount, int deletedCount) {
recordHarvestJobStatus(hcId, currentTime, harvestedCount, failedCount, deletedCount, ClientHarvestRun.RunResultType.COMPLETED);
}

@TransactionAttribute(TransactionAttributeType.REQUIRES_NEW)
public void setHarvestCompletedWithFailures(Long hcId, Date currentTime, int harvestedCount, int failedCount, int deletedCount) {
recordHarvestJobStatus(hcId, currentTime, harvestedCount, failedCount, deletedCount, ClientHarvestRun.RunResultType.COMPLETED_WITH_FAILURES);
}

@TransactionAttribute(TransactionAttributeType.REQUIRES_NEW)
23 changes: 15 additions & 8 deletions src/main/java/edu/harvard/iq/dataverse/util/json/JsonPrinter.java
@@ -423,11 +423,17 @@ public static JsonObjectBuilder json(FileDetailsHolder ds) {
}

public static JsonObjectBuilder json(DatasetVersion dsv, boolean includeFiles) {
return json(dsv, null, includeFiles, false);
return json(dsv, null, includeFiles, false, true);
}
public static JsonObjectBuilder json(DatasetVersion dsv, boolean includeFiles, boolean includeMetadataBlocks) {
return json(dsv, null, includeFiles, false, includeMetadataBlocks);
}
public static JsonObjectBuilder json(DatasetVersion dsv, List<String> anonymizedFieldTypeNamesList,
boolean includeFiles, boolean returnOwners) {
return json(dsv, anonymizedFieldTypeNamesList, includeFiles, returnOwners, true);
}

public static JsonObjectBuilder json(DatasetVersion dsv, List<String> anonymizedFieldTypeNamesList,
boolean includeFiles, boolean returnOwners) {
boolean includeFiles, boolean returnOwners, boolean includeMetadataBlocks) {
Dataset dataset = dsv.getDataset();
JsonObjectBuilder bld = jsonObjectBuilder()
.add("id", dsv.getId()).add("datasetId", dataset.getId())
@@ -472,11 +478,12 @@ public static JsonObjectBuilder json(DatasetVersion dsv, List<String> anonymized
.add("sizeOfCollection", dsv.getTermsOfUseAndAccess().getSizeOfCollection())
.add("studyCompletion", dsv.getTermsOfUseAndAccess().getStudyCompletion())
.add("fileAccessRequest", dsv.getTermsOfUseAndAccess().isFileAccessRequest());

bld.add("metadataBlocks", (anonymizedFieldTypeNamesList != null) ?
jsonByBlocks(dsv.getDatasetFields(), anonymizedFieldTypeNamesList)
: jsonByBlocks(dsv.getDatasetFields())
);
if(includeMetadataBlocks) {
bld.add("metadataBlocks", (anonymizedFieldTypeNamesList != null) ?
jsonByBlocks(dsv.getDatasetFields(), anonymizedFieldTypeNamesList)
: jsonByBlocks(dsv.getDatasetFields())
);
}
if(returnOwners){
bld.add("isPartOf", getOwnersFromDvObject(dataset));
}
7 changes: 7 additions & 0 deletions src/main/java/propertyFiles/Bundle.properties
@@ -636,6 +636,13 @@ harvestclients.viewEditDialog.archiveDescription.tip=Description of the archival
harvestclients.viewEditDialog.archiveDescription.default.generic=This Dataset is harvested from our partners. Clicking the link will take you directly to the archival source of the data.
harvestclients.viewEditDialog.btn.save=Save Changes
harvestclients.newClientDialog.title.edit=Edit Group {0}
harvestclients.result.completed=Completed
harvestclients.result.completedWithFailures=Completed with failures
harvestclients.result.failure=FAILED
harvestclients.result.inProgess=IN PROGRESS
harvestclients.result.deleteInProgress=DELETE IN PROGRESS
harvestclients.result.interrupted=INTERRUPTED
harvestclients.result.details={0} harvested, {1} deleted, {2} failed.
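The new ``harvestclients.result.details`` key uses positional placeholders; ``BundleUtil.getStringFromBundle`` resolves them with ``java.text.MessageFormat``-style substitution. A minimal sketch of that substitution, with the bundle lookup itself omitted and the pattern inlined for illustration:

```java
import java.text.MessageFormat;

// Sketch of how the placeholders in the harvestclients.result.details
// property are filled with the per-run dataset counts.
public class DetailsLabel {

    // Inlined copy of the bundle value; in Dataverse it is looked up
    // via BundleUtil with the three counts passed as arguments.
    private static final String PATTERN = "{0} harvested, {1} deleted, {2} failed.";

    public static String format(long harvested, long deleted, long failed) {
        return MessageFormat.format(PATTERN, harvested, deleted, failed);
    }

    public static void main(String[] args) {
        System.out.println(format(10, 2, 1)); // "10 harvested, 2 deleted, 1 failed."
    }
}
```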

#harvestset.xhtml
harvestserver.title=Manage Harvesting Server