There have been discussions about relaxing the long-established SimRel requirement to sign all the jars/bundles of every project's release train contributions.
The IDE Working Group Steering Committee was asked for an opinion/position on the topic. I.e., are signed jars important to your organization? The general sense is that signing in and of itself is not so important but rather overall security is key, simply stated as "Build artifacts made available at the Eclipse Foundation are verifiably the ones built by respective projects."
The planning council is tasked with deciding on an appropriate strategy (rules) for ensuring that downloaded artifacts are "secure" and are actually verified to be exactly the ones produced by the projects.
Signing definitely serves that purpose and has the advantage of being verifiable even after the artifacts exist on the machine.
That being said, it has the huge disadvantage that it modifies the artifact such that if the consumed jar was already an OSGi bundle, it needs to be given a new version/ID to produce a result conforming to the current rules. It's also a disadvantage that signatures expire, requiring bundles to be signed yet again.
An alternative "external signature" approach has been proposed and prototyped/implemented.
We could adopt this as an alternative approach to signing, perhaps restricting it to those situations where the contributed jars are not built on Eclipse infrastructure and hence are not readily signed as they are today by Tycho builds on Eclipse's CI infrastructure.
Is this alternative approach sufficient? What are the drawbacks and advantages? (Where is it documented?)
Or taking one more step back, do we actually need anything beyond secure metadata (https) and SHAs to verify that the artifacts published to a repository are exactly the same ones (the same bytes) downloaded to the client from the internet? Even if we don't strictly need anything in addition to this, do we nevertheless want an additional (alternative) layer of security? After all, we all do signing already so other than the (significantly) longer build times, disabling internal signing now doesn't buy us anything (beyond faster builds).
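For concreteness, the "https + SHAs" check described above can be done with standard tools. A minimal sketch (file path is hypothetical; the expected hash is the well-known SHA-256 of empty input, used purely so the example is self-checking):

```shell
# Sketch: compare a downloaded artifact's SHA-256 against the value recorded
# in the repository metadata. The path is hypothetical; the expected hash is
# the SHA-256 of empty input, chosen only to make the example self-contained.
artifact=/tmp/org.example.bundle_1.0.0.jar
printf '' > "$artifact"                 # stands in for the downloaded jar bytes
expected=e3b0c44298fc1c149afbf4c8996fb92427ae41e4649b934ca495991b7852b855
actual=$(sha256sum "$artifact" | cut -d' ' -f1)
[ "$actual" = "$expected" ] && echo "verified" || echo "MISMATCH"
```

If the bytes on disk differ by even one bit from what the metadata promised, the comparison fails; that is the property the "secure metadata plus SHAs" argument relies on.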
"Build artifacts made available at the Eclipse Foundation are verifiably the ones built by respective projects."
So basically, a post-build step that would verify (in one way or another) that what's part of the SimRel site or IDE is correct is enough?
E.g., we don't care about what happens on users' end as long as what's on the infra can be verified?
"Build artifacts made available at the Eclipse Foundation
are verifiably the ones built by respective projects."
Won't this require a "certificate per project"? AFAIK the certificate is the same for all artifacts at the moment (most probably because Eclipse is not a code-signing-certificate authority that can create individual ones).
Signing definitely serves that purpose and has the advantage of being
verifiable even after the artifacts exist on the machine.
As suggested in Bug 575540, if we handle signatures as extra files/artifacts, a signature can work similarly without modifying the artifact, while still allowing the signature to be available alongside the artifact.
Or taking one more step back, do we actually need anything beyond secure
metadata (https) and SHAs to verify that the artifacts published to a
repository are exactly the same ones (the same bytes) downloaded to the
client from the internet?
I think signed jars just give a false sense of security as long as every committer potentially has access to the (Eclipse) signing infrastructure, so from my point of view metadata is enough (when talking about mirrors). But in general this also does not help much as long as one can add any site to Eclipse without any verification (you just need one 'bad' update site and can then install any code you like via patch features).
Even if we don't strictly need anything in addition to this,
do we nevertheless want an additional (alternative) layer
of security? After all, we all do signing already so other than the
(significantly) longer build times, disabling internal signing now doesn't
buy us anything (beyond faster builds).
If we really want to make things more secure, we probably need some kind of PGP web of trust + (cryptographically) signed commits so every line of code/change is verified/reviewed by a "completely trusted" person ... but that's probably far beyond the scope of this.
"Build artifacts made available at the Eclipse Foundation are verifiably the ones built by respective projects
So basically, a post-build step that would verify (in a way or another) that
what's part of the SimRel site or IDE is correct is enough?
Eg, we don't care about what happens on users' end as long as what's on the
infra can be verified?
Well, let's look closely at what I said and what you paraphrased:
"do we actually need anything beyond secure metadata (https) and SHAs to verify that the artifacts published to a repository are exactly the same ones (the same bytes) downloaded to the client from the internet"
!=
"we don't care about what happens on users' end as long..."
If we are going to treat this process like a supreme court with an army of lawyers, where every last word and phrase is parsed into all possible meanings, and where we can pick one of those meanings (one that's effectively nonsensical and contradictory to the original intent) as perhaps the intended meaning, then we can continue refining the wording at the next meeting until no one is able to misinterpret it contrary to the original intent.
Of course the fundamental point is that one can (and does) verify that the bytes published by the project are exactly the bytes downloaded to the client. I don't speak for the Steering Committee, but I believe this to be the intent.
"Build artifacts made available at the Eclipse Foundation
are verifiably the ones built by respective projects."
Won't this require a "certificate per project", afaik the certificate is the
same for all artifacts at the momment (most probably because Eclipse is not
a code-signing-certificate authority to create individual ones)
It's hard to imagine why one would need a certificate per project. That seems clearly unworkable.
Signing definitely serves that purpose and has the advantage of being
verifiable even after the artifacts exist on the machine.
As suggested in Bug 575540, if we handle signatures as extra files/artifacts
a signature can work similar without modify the artifact but still allow to
have signing alongside with the artifact.
This is a further implementation detail of whether the external signature is a separate "artifact/file" versus a data item in the repository metadata XML (as is the case for the SHAs currently).
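To make the two options concrete: today's SHAs live as properties on the artifact entry in the repository's artifacts.xml, and a hypothetical external signature could be carried the same way. In the sketch below, download.checksum.sha-256 is the existing p2 checksum property; the signature property name and values are invented purely for illustration:

```xml
<artifact classifier='osgi.bundle' id='org.example.bundle' version='1.0.0'>
  <properties size='3'>
    <property name='download.size' value='12345'/>
    <property name='download.checksum.sha-256' value='e3b0c44298fc...'/>
    <!-- hypothetical: a detached signature stored as repository metadata
         instead of as a separate file next to the jar -->
    <property name='download.signature.external' value='-----BEGIN PGP SIGNATURE-----...'/>
  </properties>
</artifact>
```

Either carrier (separate file or metadata property) leaves the jar bytes untouched, which is the point of the external-signature approach.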
Or taking one more step back, do we actually need anything beyond secure
metadata (https) and SHAs to verify that the artifacts published to a
repository are exactly the same ones (the same bytes) downloaded to the
client from the internet?
I think signed-jars just give a false security implication as long as very
committer has potentially access to (eclipse) signing infrastructure, so
from my point of view meta-data is enough (when talking about mirrors) but
in general this also does not help much as long one can add any site to
eclipse without any verification (you just need one 'bad' update-site and
can then install any code you like via patch features).
Yes, we discussed the fact that signatures don't ensure that the jars themselves don't contain bad logic and security loopholes.
The other things you describe are part of the "threat model". For sites hosted at Eclipse, no one can "just add any site" except authorized committers/projects who can edit the sites and provide the bytes for clients to download. For sites hosted elsewhere, all bets are off: anyone can sign any bogus thing with arbitrary signatures and make it look fine and good. Or not?
Even if we don't strictly need anything in addition to this,
do we nevertheless want an additional (alternative) layer
of security? After all, we all do signing already so other than the
(significantly) longer build times, disabling internal signing now doesn't
buy us anything (beyond faster builds).
If we really like to make things more secure, we probably need some kind of
PGP-Web-Of-thrust + (crypographically) signed commits so every line of
code/change is verified/reviewed by a "completely-trusted" person ... but
that's probably far beyond the scope of this.
So we don't actually have any solution at this point because, of course, nothing is 100% secure when it comes to software. So we only have open-ended problems...
OK, I see I got confused by the fact that "verifiably" doesn't make explicit who is supposed to be able to verify, and when; but OK to assume it's the user during installation.
And, in that case, I agree with your comment.
As discussed in bug 575688 (which is so far not really a plan for action), secure p2 metadata over HTTPS with checksums does guarantee we install the right content for a given artifact, and the content that's installed can be verified before restarting the IDE (users can compute checksums locally and compare them with the metadata they can find on download.eclipse.org). However, this only works if we ensure that the metadata themselves only come from download.eclipse.org, which in the current form is not something we can guarantee because:
1. Users usually install from other sources, and the "Contact all software sites" option is checked by default, so other sources (maybe some less reliable ones) can influence the result and push their artifacts.
2. As far as I know, a p2 repository on download.eclipse.org may be able to reference external sources (in composites or repository references). However, I don't think this is really happening in practice.
If the goal is purely to establish verifiability, then signatures are not relevant at all. Signatures are relevant when it comes to trust, and it doesn't seem like building trust strategies is a priority. Basically, things like PGP seem out of scope.
(In reply to Christoph Laeubrich from comment #2)
(In reply to Ed Merks from comment #0)
"Build artifacts made available at the Eclipse Foundation
are verifiably the ones built by respective projects."
Won't this require a "certificate per project", afaik the certificate is the
same for all artifacts at the momment (most probably because Eclipse is not
a code-signing-certificate authority to create individual ones)
It's hard to imagine why would one need a certificate per project. That
seems clearly unworkable..
Sorry for the possible confusion, I don't mean project in the sense of a (source) project you import into the IDE, but "Eclipse projects" like Jetty, Platform, ... Or how should one understand "know what the respective (Eclipse) project has built" if all use the same signature/certificate/...? At least for Maven artifacts I have seen individual projects using individual PGP keys, so it could be part of the project setup to have a dedicated key for each project.
Signing definitely serves that purpose and has the advantage of being
verifiable even after the artifacts exist on the machine.
As suggested in Bug 575540, if we handle signatures as extra files/artifacts
a signature can work similar without modify the artifact but still allow to
have signing alongside with the artifact.
This is a further implementation detail of whether the external signature is
a separate "artifact/file" versus a data item in the repository metadata XML
(as is the case for the SHAs currently).
I was thinking more of (which is an extension of my original request!) that p2 could even 'install' the PGP signature into an Eclipse so it could be queried/checked even at startup/runtime through the p2 query API.
The other things you describe are part of the "threat model". For sites
hosted at Eclipses, no one can "just add any site" except authorized
committers/projects who can edit the sites and provide the bytes for clients
to download. For sites hosted elsewhere, all bets or off: any one can sign
any bogus thing with any arbitrary signatures and make it look like it's
fine and good. Or not?
I mean what Mickael also pointed out: currently any IU is installed/updated from any source (isn't it?), so maybe it would be good to have a property in the metadata that states artifacts installed from this source are only allowed to be updated/installed from a given set of "safe hosts"; this could contain wildcards like eclipse.org/updates/platform/* or something.
So we don't actually have any solution at this point because of course
nothing is 100% secure when it comes to software. So we only have open
ended problems...
No, I just wanted to note that we already have good ways to make sure the artifact has not been tampered with (md5, sha256, pgp, ...); if we want more than "the artifact is not modified since it was published in this metadata", it will require a lot more work.
OK, I see I got confused by the fact that "verifiably" doesn't explicit whom
and when is supposed to be able to verify; but OK to assume it's the user
during the installation.
And, in that case, I agree with your comment.
As discussed in bug 575688 (which is so far not really a plan for action),
secured p2 metadata over HTTPS with checksums do guarantee we install the
right content for a given artifact, and the content that's installed can be
verified before restarting the IDE (user can compute checksums locally and
compare them with metadata they can find on download.eclipse.org). However,
this only works if we ensure that the metadata themselves are only coming
from download.eclipse.org; which in the current form is not something we can
guarantee because:
Users usually install from other sources, and the "Contact all software
sites" options is checked by default, so other sources -and maybe some less
reliable ones- can influence the result and push their artifacts.
I think this is a super important point you make. The "threat model" here is that someone provides an update site whose artifacts have IDs matching the artifact IDs distributed by some Eclipse project. While those "alternate artifacts" might well have valid checksums, and their metadata might well be secured by HTTPS, the artifact bytes could nevertheless be different from the ones actually produced by an Eclipse project. That seems like a problem we should prevent.
As far as I know, a p2 repository on download.eclipse.org may be able to
reference external sources (in composite or repository-references). However
I don't think this is really happening in practice.
Yes, this seems a little more far-fetched. We're really not supposed to be redistributing arbitrary content that hasn't been reviewed/approved.
If the goal is purely to establish verifiability, then signatures are not
relevant at all. Signatures are relevant when it comes to trust, and it
doesn't seem like building trust strategies is a priority. Basically, things
like PGP seem out of the scope.
Yes, though I wonder here if signatures are really about trust versus about "certification of origin". It's already been pointed out that trust is a pretty strong word, i.e., "we (Eclipse) signed this jar so you can trust it's secure." But can we really trust that the software actually functions securely? Not so much, I think...
I bring this up because, in relation to your point 1. above, I think signatures (internal and/or external) prevent substitution of artifacts except by those that are also signed and therefore also associated with a particular origin.
Yes, though I wonder here if signatures are really about trust versus about
"certification of origin". It's already been pointed out that trust is a
pretty strong word, i.e., "we (Eclipse) signed this jar so you can trust
it's secure." But can we really trust that the software actual functions
securely? Not so much I think...
As mentioned above we can't really trust "Eclipse", but that's why I think there should not be a "global" Eclipse certificate but project-specific ones, because I might trust 'tycho' as I regularly follow the code stream and review the code changes, but I won't trust an arbitrary Eclipse project because I don't know anything about it.
So "certification of origin" can also be a source of "trust".
I bring this up because, in relation to your point 1. above, I think
signatures (internal and/or external) prevent substitution of artifacts
except by those that are also signed and therefore also associated with a
particular origin.
That sounds like a good idea, that one artifact can only be updated by the same "origin" (or at least give the user a strong warning if not). Again, here it would be useful if it were more fine-grained than "eclipse-or-not-eclipse".
(In reply to Christoph Laeubrich from comment #2)
(In reply to Ed Merks from comment #0)
"Build artifacts made available at the Eclipse Foundation
are verifiably the ones built by respective projects."
Won't this require a "certificate per project", afaik the certificate is the
same for all artifacts at the momment (most probably because Eclipse is not
a code-signing-certificate authority to create individual ones)
It's hard to imagine why would one need a certificate per project. That
seems clearly unworkable..
Sorry for possible confusion, I don't mean project in the sense of an
(source) project you import into the IDE but "eclipse projects" like, jetty,
platform, ... or how should one understand the "know what respective
(eclipse) project has build" if all use the same signature/certificat/...?
At least at maven artifacts I have seen individual projects using individual
PGP keys so it could be part of the project setup to have a dedicated key
for each project.
I don't believe it's important to know which project produced the artifacts but rather that some Eclipse project produced the artifacts, i.e., the certification of origin back to Eclipse. There's typically plenty of data in the artifacts to track them back to projects.
Signing definitely serves that purpose and has the advantage of being
verifiable even after the artifacts exist on the machine.
As suggested in Bug 575540, if we handle signatures as extra files/artifacts
a signature can work similar without modify the artifact but still allow to
have signing alongside with the artifact.
This is a further implementation detail of whether the external signature is
a separate "artifact/file" versus a data item in the repository metadata XML
(as is the case for the SHAs currently).
I more can think of (what is an extension to my original request!) that P2
could even 'install' the PGP signature into an eclipse so it could be
queried/checked even at startup/runtime through P2 query API.
Indeed, external signatures could be "installed" to be available for later use; perhaps just saved as part of the profile which is effectively an artifact repository.
The other things you describe are part of the "threat model". For sites
hosted at Eclipses, no one can "just add any site" except authorized
committers/projects who can edit the sites and provide the bytes for clients
to download. For sites hosted elsewhere, all bets or off: any one can sign
any bogus thing with any arbitrary signatures and make it look like it's
fine and good. Or not?
I mean what mickael also phrased out that currently any IU is
installed/updated from any source (isn't it?) so maybe it would be good to
have a property in the metadata that states artifacts installed from this
source are only allowed to be updated/installed from a given set of "save
hosts this could contain wildcards like eclipse.org/updates/platform/* or
something
That sounds significantly complicated, and it is fragile when organizations mirror repositories.
So we don't actually have any solution at this point because of course
nothing is 100% secure when it comes to software. So we only have open
ended problems...
No I just wanted to note that we already have good ways to make sure the
artifact is not tampered (md5, sha256, pgp,...) if we want more than "the
artifact is not modified since it was published in this metadata" it will
require a lot more work.
Yes, though I wonder here if signatures are really about trust versus about
"certification of origin". It's already been pointed out that trust is a
pretty strong word, i.e., "we (Eclipse) signed this jar so you can trust
it's secure." But can we really trust that the software actual functions
securely? Not so much I think...
As mentioned above we can't really trust "eclipse", but that's why I think
there should not be a "global" eclipse certificate but projects specific
ones, because I might trust 'tycho' as I regular follow the code stream and
review the code changes, but I won't trust an arbitrary eclipse project
because I don't know anything about it.
So "certification of origin" can also be a source of "trust"
This seems to me way more complicated than what we currently have and, given the stacked nature of dependencies, trusting one project but not another just doesn't seem feasible. E.g., if one doesn't trust EMF or ECF, then there is no release train...
I bring this up because, in relation to your point 1. above, I think
signatures (internal and/or external) prevent substitution of artifacts
except by those that are also signed and therefore also associated with a
particular origin.
That sounds like a good idea, that one artifact can only be updated by the
same "origin" (or at least give the user a strong warning if not). Again
here it would be usefull if it is more fine grained than
"eclipse-or-not-eclipse"
I'm not sure how far off the deep end we need to go on this. People will install bundles from other sites. The marketplace has 1000s of them. I believe human beings will not be making fine-grained decisions about each bundle but rather coarse-grained ones. "Is this bundle from an origin that certifies that this thing I'm installing comes from that origin?" And this information is generally not even presented to the users. Mostly licenses and missing certification are presented...
Keep in mind that this issue arose to reduce the effort around contributing to SimRel. Ideally we don't make life more complicated but also we don't make things less secure either...
This seems to me way more complicated than what we currently have and, given
the stacked nature of dependencies, trusting one project but not another
just doesn't seem feasible. E.g., if one don't trust EMF or ECF, then there
is no release train...
Security comes at a price... I wouldn't expect someone to install EMF or ECF directly, but rather to use some kind of pre-built package, which can ship with the necessary trusted "root certificates"...
I bring this up because, in relation to your point 1. above, I think
signatures (internal and/or external) prevent substitution of artifacts
except by those that are also signed and therefore also associated with a
particular origin.
That sounds like a good idea, that one artifact can only be updated by the
same "origin" (or at least give the user a strong warning if not). Again
here it would be usefull if it is more fine grained than
"eclipse-or-not-eclipse"
I'm not sure how far off the deep end we need to go on this. People will
install bundles from other sites. The marketplace has 1000 of them. I
believe human beings will not be making fine grained decisions about each
bundle but rather course grained ones.
I wouldn't expect a decision for each bundle, but if the SSL certificate of a host changes after I have first contacted it, I get a strong warning that something might be wrong.
Decisions in a "web of trust" can also be hierarchical, so if I trust "Eclipse" and Eclipse has declared high trust in "Apache", I can also trust "Apache" bundles.
"Is this bundle from an origin that certifies that this thing I'm
installing comes from that origin"? And this
information is generally not even presented to the users. Mostly licenses
and missing certification are presented...
But how is a PGP signature different from a certificate? So I think the most "unobtrusive" option would be to handle them the same here. If a PGP signature uses a key completely unknown to the user, a dialog will ask whether the user wants to trust it (that's how it works currently, but without the choice to permanently trust it, as far as I know), and then in the future I don't get bothered.
Keep in mind that this issue arose to reduce the effort around contributing
to SimRel. Ideally we don't make life more complicated but also we don't
make things less secure either...
Security comes at a price... Given that Maven Central enforces PGP-signed artifacts, and there are hundreds of thousands of artifacts and projects contributing content there, setting up a PGP key can be assumed to be not so complicated.
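For scale, the detached "external signature" workflow being discussed is roughly the following. This is a sketch with throwaway key parameters and hypothetical file names, assuming GnuPG 2.x is installed; a real project would use its own long-lived key:

```shell
# Sketch: generate a project key and produce a detached signature, leaving
# the jar bytes untouched. All names and parameters are illustrative only.
export GNUPGHOME=$(mktemp -d) && chmod 700 "$GNUPGHOME"
gpg --batch --gen-key <<'EOF'
%no-protection
Key-Type: RSA
Key-Length: 2048
Name-Real: Demo Project
Name-Email: demo@example.org
Expire-Date: 0
%commit
EOF
printf 'bundle bytes' > /tmp/bundle.jar
gpg --batch --yes --armor --detach-sign /tmp/bundle.jar   # writes /tmp/bundle.jar.asc
gpg --verify /tmp/bundle.jar.asc /tmp/bundle.jar && echo "signature OK"
```

Unlike jarsigner, nothing inside the jar changes, so an already-released OSGi bundle keeps its version/ID; the .asc file travels next to it (or in the metadata).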
(In reply to Christoph Laeubrich from comment #11)
(In reply to Ed Merks from comment #10)
This seems to me way more complicated than what we currently have and, given
the stacked nature of dependencies, trusting one project but not another
just doesn't seem feasible. E.g., if one don't trust EMF or ECF, then there
is no release train...
Security comes at a price... I won't expect someone to install EMF or ECF
but using some kind of pre-build package, which can ship with the necessary
trusted "root-certificates"...
The installer installs everything from scratch, so people do in fact make exactly this kind of "decision" whenever they install a package via the installer.
And I thought the root certificates are in the JRE/JDK?
I bring this up because, in relation to your point 1. above, I think
signatures (internal and/or external) prevent substitution of artifacts
except by those that are also signed and therefore also associated with a
particular origin.
That sounds like a good idea, that one artifact can only be updated by the
same "origin" (or at least give the user a strong warning if not). Again
here it would be usefull if it is more fine grained than
"eclipse-or-not-eclipse"
I'm not sure how far off the deep end we need to go on this. People will
install bundles from other sites. The marketplace has 1000 of them. I
believe human beings will not be making fine grained decisions about each
bundle but rather course grained ones.
I won't expect each bundle to be decided over, but if the SSL certificate of
a host changes after I first have contacted it I get a strong warning that
something might be wrong.
What you outline does suggest that when installing many dozens of projects, I will be making dozens of trust decisions. Each EPP package is composed from a great many projects. Even the Platform SDK is many projects. So multiple trust decisions sounds rather unworkable...
Decisions in a "web-of-thrust" can also be hierarchical, so if I thrust
"Eclipse" and Eclipse has claimed high trust in "Apache" I can also thrust
"Apache" bundles.
Now we've gone away from per-project back to coarse grained "hosts". I guess my problem here is that I don't actually concretely know all the details of the PGP proposal and perhaps I'm making some poor assumptions as a result...
"Is this bundle from an origin that certifies that this thing I'm
installing comes from that origin"? And this
information is generally not even presented to the users. Mostly licenses
and missing certification are presented...
But how is A PGP signature different from a certificate?
I didn't suggest they are different. I assumed they are certificate-based, as are internal signatures. But again, note that I'm making assumptions.
So I think the most
"unobtrusive" option would be to handle them the same here. If a PGP
signature of a key completely unknown to the user a dialog will ask if the
user likes to trust it (that's how it work currently but without the choice
to permanently trust it as far as I know) and then in the future I don't get
bothered.
Currently the user is prompted only if the root certificate isn't known in the JRE. I thought that would still be the case. Perhaps I'm making yet another bad assumption.
Keep in mind that this issue arose to reduce the effort around contributing
to SimRel. Ideally we don't make life more complicated but also we don't
make things less secure either...
Security comes at a price... Given that Maven-Central enforces PGP signed
artifacts and there are hundreds of thousands artifacts and projects
contributing content there it seems setting up a PGP key can be assumed not
so complicated.
I didn't suggest that PGP signing is complicated. I was under the impression that it is already implemented. It's all this per-project trust/certification stuff that starts to seem complicated to me. Also, I believe the current thinking was that this new alternative external signing is only used for things not already easily signed as we've been doing for years. But perhaps you're thinking differently on that front too...
The scope here seems to be not what I expected - this conversation started as (from Mickael's email[1]) a request to drop jarsigning requirement (with the open ended question of what security do we want for simrel). But now seems to have transformed into discussions about rearchitecting the entire security model of p2.
The IDE WG and the Planning Council are clearly in favour of change, and the requirement is very short. Note that nothing in the requirement from the IDE WG says how it needs to be verifiable, that can require multiple steps.
For example, a project contributes to SimRel, SimRel does a mirror operation. That is verifiable today and does not require signing. Then Eclipse IDE installs from SimRel, and it is possible using the hashes to verify that what was downloaded was what was expected.
Why is it more complicated than that to meet the requirement from WG? Which is back to last para from Comment 0 which seems to have one answer:
(Christoph Laeubrich from comment #2)
from my point of view meta-data is enough
The scope here seems to be not what I expected - this conversation started
as (from Mickael's email[1]) a request to drop jarsigning requirement (with
the open ended question of what security do we want for simrel). But now
seems to have transformed into discussions about rearchitecting the entire
security model of p2.
Yes, I was hoping to bound the problem not have it snowball into something bigger.
The IDE WG and the Planning Council are clearly in favour of change, and the
requirement is very short.
Yes, though apparently easy to misinterpret...
Note that nothing in the requirement from the IDE
WG says how it needs to be verifiable, that can require multiple steps.
For example, a project contributes to SimRel, SimRel does a mirror
operation. That is verifiable today and does not require signing. Then
Eclipse IDE installs from SimRel, and it is possible using the hashes to
verify that what was downloaded was what was expected.
Yes, though Mickael had an important observation. If the "available" update sites include some non-Eclipse-hosted site, the artifact metadata for some arbitrary installable unit could well come from such an alternate site, in which case the artifact itself could be an alternative-bytes version of the artifact (with the correct hash sum in that alternative artifact metadata), so nothing is noticed as bad or wrong...
Do you see what I/he means?
Such spoofing is possible even using https, and is even possible when signing is enforced, but of course in the case of signing, the alternative jar would then also need to be signed (by something other than Eclipse, though), which at least would make it more trackable...
Do we care about any of this?
Why is it more complicated than that to meet the requirement from WG? Which
is back to last para from Comment 0 which seems to have one answer:
(Christoph Laeubrich from comment #2)
from my point of view meta-data is enough
Yes - I understand this use case. It is certainly a problem that the Eclipse community can look to solve. @Mikael/@Ed do you see this as blocking to change the install model? If not, let's split that issue out. The issue is not new, it is orthogonal to the issue here, and it also does not require signing to solve.
Such spoofing is possible even using https, and is even possible when
signing is enforced, but of course in the case of signing, the alternative
jar would then also need to be signed (by something other than Eclipses
though), which at least would make it more trackable...
Do we care about any of this?
Yes. We want to make sure that users install what they think they are installing. If I (as a user) put a whole bunch of p2 sites in my Available Sites list, that is me saying I am happy with the content of those p2 sites. So the only thing we should care about is that the artifacts are the ones in those p2 sites, as determined by the p2 metadata the sites serve up (this is why Bug 575688 is important: Eclipse needs to make sure it is downloading the metadata the user intended).
We (as the maintainers of SimRel and Eclipse projects) have a responsibility that everything made available on p2 sites located on download.eclipse.org is what we intended. That is done by delegating trust to individual projects to only place valid artifacts on download.eclipse.org, and it is done by the Webmaster securing access to download.eclipse.org as they do now with SSH via the CI infra.
If we assume that users do trust the sites they're adding to their available software or when installing content (they can verify those sites in advance for trust, e.g. by looking at metadata), then it doesn't become an issue if one site "leaks" an artifact into the resolution: it's consistent with the user's decision to trust a site.
When all sites are trusted, there is no point in further signatures and checks IMO, just as most p2 repos have unsigned content and most users just click the "Install anyway" (read "I don't care") button without verifying, simply because they trust their providers.
Wherever artifacts do come from, it's always possible to verify them just after installation and before a restart (ie before newly installed content is activated and executed) by comparing local installation to what's expected from the reference provider.
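That post-install, pre-restart check needs nothing more than a checksum list. A sketch with hypothetical paths (here the reference list is generated locally just to make the example self-contained; a real check would derive it from the p2 metadata on download.eclipse.org):

```shell
# Sketch: recompute checksums of installed bundles and compare them against a
# reference list before restarting. Paths and bundle names are illustrative;
# the reference file stands in for checksums published with the metadata.
install_dir=$(mktemp -d)/plugins
mkdir -p "$install_dir"
printf 'bundle bytes' > "$install_dir/org.example.bundle_1.0.0.jar"
sha256sum "$install_dir"/*.jar > /tmp/reference.sha256   # stands in for published metadata
sha256sum --check --quiet /tmp/reference.sha256 && echo "installation matches reference"
```

Because the check runs before the newly installed content is activated, a mismatch can be caught while it is still inert on disk.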
If we assume that users do trust the site they're adding to their available
software or when installing content (they can verify those sites in advance
for trust - eg by looking at metadata)
P2 sites can also come from p2-touchpoint instructions when installing stuff if I remember correctly.
Anyway, as you mentioned, I think there are two "user groups":
- those who don't care and click any button to "get that stuff installed" as they think it is useful
- those who are not allowed to install stuff because of company policies
(In reply to Christoph Laeubrich from comment #17)
P2 sites can also come from p2-touchpoint instructions when installing stuff
if I remember correctly.
p2 touchpoints are visible in the metadata, similarly to repository
references. So this can be verified even before installation by those who
care.
Do you really know anyone who reads and fully understands p2 metadata.xml before adding an update site? One even needs to investigate each artifact to see whether it contains a p2.inf to be absolutely sure...