From: Brendan Tildesley
Subject: bug#35575: logo,Some graphical programs borked with Guix on Arch
Date: Sun, 29 Mar 2020 15:00:57 +1100
To: 35575@debbugs.gnu.org

To follow up on this old bug, I believe the issue may come from here:

https://gitlab.freedesktop.org/mesa/mesa/-/blob/master/src/compiler/glsl/shader_cache.cpp#L144

Mesa calculates a SHA-1 from the things it reasons affect the output, but
that hash is likely not truly a function of every parameter that can make a
difference to the shader output.  When we updated from llvm6 to llvm7, I'm
guessing the generated shaders changed somehow, and the LLVM version is not
included in the hash.

Since I have zero understanding of Mesa, I'm not capable of determining the
best solution.  One thought is that if we included Mesa's /gnu/store path in
the calculation, the hashes would be truly unique for a given Mesa build,
but cached shaders that /would/ still work would also be routinely discarded
after every update (I assume?).  Would this be sensible, or would it
completely break something else?  Should we just add the LLVM version, or
start a Mesa bug report asking for input?  A rough sketch of the idea
follows the quoted code below.

The code:

   ralloc_asprintf_append(&buf, "tf: %d ", prog->TransformFeedback.BufferMode);

   for (unsigned int i = 0; i < prog->TransformFeedback.NumVarying; i++) {
      ralloc_asprintf_append(&buf, "%s ",
                             prog->TransformFeedback.VaryingNames[i]);
   }

   /* SSO has an effect on the linked program so include this when generating
    * the sha also.
    */
   ralloc_asprintf_append(&buf, "sso: %s\n",
                          prog->SeparateShader ? "T" : "F");

   /* A shader might end up producing different output depending on the glsl
    * version supported by the compiler. For example a different path might be
    * taken by the preprocessor, so add the version to the hash input.
    */
   ralloc_asprintf_append(&buf, "api: %d glsl: %d fglsl: %d\n",
                          ctx->API, ctx->Const.GLSLVersion,
                          ctx->Const.ForceGLSLVersion);

   /* We run the preprocessor on shaders after hashing them, so we need to
    * add any extension override vars to the hash. If we don't do this the
    * preprocessor could result in different output and we could load the
    * wrong shader.
    */
   char *ext_override = getenv("MESA_EXTENSION_OVERRIDE");
   if (ext_override) {
      ralloc_asprintf_append(&buf, "ext:%s", ext_override);
   }

   /* DRI config options may also change the output from the compiler so
    * include them as an input to sha1 creation.
    */
   char sha1buf[41];
   _mesa_sha1_format(sha1buf, ctx->Const.dri_config_options_sha1);
   ralloc_strcat(&buf, sha1buf);

   for (unsigned i = 0; i < prog->NumShaders; i++) {
      struct gl_shader *sh = prog->Shaders[i];

      _mesa_sha1_format(sha1buf, sh->sha1);
      ralloc_asprintf_append(&buf, "%s: %s\n",
                             _mesa_shader_stage_to_abbrev(sh->Stage), sha1buf);
   }
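For illustration only, here is roughly what adding such inputs might look
like next to the lines above.  This is an untested sketch, not a patch:
LLVM_VERSION_STRING is the version macro LLVM ships in its llvm-config.h
header, and MESA_GUIX_BUILD_ID is a hypothetical define that our mesa
package would have to pass at build time (for instance the /gnu/store
output path):

   /* Hypothetical extra cache-key inputs: the LLVM version and a build
    * identifier supplied by the packaging (e.g. the /gnu/store path).
    * Either one would make stale cache entries miss after a Guix update.
    */
   ralloc_asprintf_append(&buf, "llvm: %s build: %s\n",
                          LLVM_VERSION_STRING, MESA_GUIX_BUILD_ID);

Salting with the store path invalidates the whole cache on every rebuild,
while the LLVM version alone keeps more entries but only covers this one
kind of mismatch; that is exactly the trade-off I am unsure about.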