Replies: 2 comments 2 replies
- Thanks Jamie. @RaggedStaff Can you provide more details about the objective of the project?
  - @lecoqlibre has proposed to introduce mixins to the Semantizer. We can utilise these from the connector (probably via a new static module) controlled by a boolean env variable that defaults to FALSE. We will also require further configuration (i.e. where to send the data).
Objective
Enable remote data capture functionality in the DFC connector, as requested by
the FDC Governance Circle, so that data may be captured within the DFC
Network and relayed to an independent triple store that will act as a Data
Commons.
Proposal
While we discussed the possible necessity of incorporating the data capture
mechanism into the code generator's templates, I've realized that it may never
be necessary, or even desirable. In all three implementations of the connector,
the core request/response logic can be found within the main `Connector` class
or its modules (such as the `JsonldStream` importer and exporter in the case of
the TypeScript implementation), which are all contained within the static code
directories and not produced through code generation. Because these
import/export methods are indirectly invoked by all semantic object subclasses'
getters, setters, adders and removers, they are the ideal place to inject
optional hooks that could extend the import/export behavior.
A good model for this kind of extension might be the axios library's
interceptor pattern:
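```ts
import axios from "axios";

// A response interceptor runs after every request completes; the first
// callback handles successful responses, the second handles errors, and
// both pass the result through unchanged.
axios.interceptors.response.use(
  (response) => {
    // A capture hook could relay response data from here.
    console.log(`captured response from ${response.config.url}`);
    return response;
  },
  (error) => Promise.reject(error),
);
```
The request and response managers expose the same registration API, which is what makes the pattern easy to bolt on without touching the client's core logic.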
Internally the axios interceptors are private members of the
`InterceptorManager`, with separate instantiations for the request and response
cycles. The interceptors can also be "ejected", using the id returned by `use()`:
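```ts
import axios from "axios";

// use() returns an id for the registered interceptor, which can later be
// passed to eject() to remove the hook again.
const interceptorId = axios.interceptors.request.use((config) => config);
axios.interceptors.request.eject(interceptorId);
```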
Some consideration should be given to the API for the connector and the
corresponding getters and setters that will actually invoke the capturing logic.
The getters and setters can differ in behavior, with some being synchronous and
others asynchronous, while the capturing behavior will always be asynchronous.
But we could generally take an approach such as the following:
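Here is a minimal sketch of one such approach; the names (`Exporter`, `use()`, the endpoint URL) are hypothetical placeholders and not existing connector API:

```ts
// A minimal sketch only; none of these names exist in the connector yet.
type ExportInterceptor = (document: string) => Promise<void>;

class Exporter {
  private interceptors: ExportInterceptor[] = [];

  // Register a capture hook, mirroring axios' use(); the returned index
  // could later be used to eject it.
  use(interceptor: ExportInterceptor): number {
    this.interceptors.push(interceptor);
    return this.interceptors.length - 1;
  }

  // Invoked indirectly by the semantic objects' getters, setters, adders
  // and removers; the capture hooks always run asynchronously.
  async export(document: string): Promise<string> {
    await Promise.all(this.interceptors.map((hook) => hook(document)));
    return document;
  }
}

// Usage: relay every exported JSON-LD document to a (hypothetical) triple
// store endpoint acting as the Data Commons.
const logger: ExportInterceptor = async (document) => {
  await fetch("https://data-commons.example/logs", {
    method: "POST",
    headers: { "Content-Type": "application/ld+json" },
    body: document,
  });
};

const exporter = new Exporter();
exporter.use(logger);
```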
Where `logger` could be a function (or two functions, to handle both success and
error results), or an instance of a `Logger` class with a wider variety of
configurable options, or both.
As for the triple store where logs will be sent, there are a lot of options. To
begin, the DFC prototype could be used for running integration tests in the
local development environment. If that achieves much of the desired outcome, a
fork of it could be prepared for deployment. A more customized solution could be
built with SemApps, but might require more development. Another complicating
factor is the degree to which OFN's stakeholders would like this store to be
integrated with OFN's core software and regional server instances.
That decision ranges from tight integration with the core OFN software and
server instances to a totally independent server that core OFN knows nothing
about. Until those decisions are made and discussed in more detail, it will be
difficult to judge with much accuracy the cost and time required to stand up a
maintainable instance of the triple store. In any case, the proposed logging
interceptor should work just the same, since the only parameter it strictly
requires is a location to send the logs to. Different logging interceptors can
be adapted to different behaviors as desired, and even combined, since the
pattern supports registering multiple interceptors. This flexibility may in
fact allow for more incremental development of the triple store and of how it
is deployed to production.
Requirements
- The `import.use()` and `export.use()` methods, a general interface for the function or `Interceptor` class they would each accept as arguments, and the implementations of those functions or classes as the actual `ImportLogger` and `ExportLogger`. Obviously, the names for all these classes and methods can be decided upon later. These will first be implemented in TypeScript.
- Unit tests for the implementation(s), extending the existing TypeScript connector tests as appropriate. These will only mock the intended triple store behavior (a sketch follows this list).
- A local instance of a triple store, possibly based on the DFC prototype or SemApps, that can receive and store JSON-LD logs. Preferably this local instance will be containerized so it's easy to replicate on a staging server, or perhaps as the basis for a store that can eventually go into production for the data commons.
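As a rough illustration of the unit-test requirement, the mocked triple store could simply capture documents in memory. This sketch uses Node's built-in test runner and the illustrative `Exporter` class from earlier in this proposal; neither reflects the connector's actual test setup:

```ts
import { test } from "node:test";
import assert from "node:assert/strict";

test("export interceptor relays JSON-LD to the mocked store", async () => {
  const received: string[] = [];
  const exporter = new Exporter(); // illustrative class sketched above

  // Mock of the triple store: capture documents in memory instead of
  // making a real network call.
  exporter.use(async (document) => {
    received.push(document);
  });

  await exporter.export('{"@id": "https://example.net/product/1"}');

  assert.equal(received.length, 1);
});
```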
Milestones
- The `import.use()` and `export.use()` methods, interfaces, classes, and corresponding unit tests.
- … and the data capture interceptors specifically.
Estimated Time and Cost
Milestones 1 and 2 will each require roughly 15 hours of development time, and
their order is more or less interchangeable. Depending on decisions about how
best to develop, test, and deploy the triple store, milestone 3 could vary
widely, from as little as 6-12 development hours to over 30, if more
customization is required beyond simply running an off-the-shelf solution.
Similarly, milestone 4 is difficult to assess at this time, but would require at
least as many dev hours, possibly more.
The contingencies in milestones 3 and 4 make this a very imprecise estimate,
costing anywhere from $4,410 to $12,600 and taking 1 to 3 months to
complete. We can speak in further detail about the expectations for the triple
store as we go ahead with the connector features, or wait until a clearer set of
requirements can be determined for all of the milestones.