Maxon Developers

    Possible to query color with UV coordinate?

    SDK Help

      On 26/07/2016 at 15:51, xxxxxxxx wrote:

User Information:
Cinema 4D Version: 16
Platform: Windows;
Language(s): C++;

      ---------

      Hello friends,

      I've been doing a bit of research into this, but I'm a shade or two out of my element and before I got much deeper I thought it'd be wise to ask if it's possible and, if so, in what general direction would be best to start.  So:

      I have a mesh with a camera-projected material on it.  Is it possible to read the rgb values of an arbitrary location on said mesh with only a uv coordinate?  It can't be baked, is the big catch; the projected material is informed by an image sequence, and we won't have time to bake the whole thing like you usually would to accomplish this the easy way.

I keep finding topics here that *almost* touch on this, but if it has been answered already I apologize--I probably didn't understand what I was reading ^^

      Thanks very much in advance for any advice!


        On 27/07/2016 at 05:56, xxxxxxxx wrote:

        Hi,

This doesn't sound trivial. Can you give us some more details? Which rgb values are you interested in? Just the values from the image sequence, or the final value in the rendered image?


          On 27/07/2016 at 08:43, xxxxxxxx wrote:

          Hey Andreas!

          From the image sequence, though in my particular case those two would be nearly synonymous, since the material is only said sequence in the luminance channel, and there are no lights, etc.

          Just for clarity: I have a UV coordinate (let's say 0.5, 0.5).  If I "selected" that coordinate (setting aside the fact that there's no such thing as selecting uv coordinates) and saw at what location it corresponded to on the mesh, is it possible to query the mesh's surface color, there?  (I realize "query the mesh's surface color" is a very loaded thing to say:)

          Thanks very much; lemme know if I'm possibly leaving anything out/not making sense.


            On 29/07/2016 at 11:50, xxxxxxxx wrote:

            Is this maybe a tiny bit too ambitious?  😊


              On 01/08/2016 at 07:20, xxxxxxxx wrote:

              Hi Whithers,

              sorry, it took a bit longer.
If you are really only interested in the rgb value of the Bitmap shader at certain uv coordinates, you can simply sample the shader.
If instead you need the surface color resulting from a mixture of other channels, you'd need to sample the material. That is really something meant to be done inside the rendering pipeline, so you'd need to create your own VolumeData.


                On 01/08/2016 at 08:12, xxxxxxxx wrote:

The code I just posted sounds a lot like what you are asking for:
                https://developers.maxon.net/forum/topic/9630/12933_convert-barycentric-coords-to-uv-coords

                -ScottA


                  On 02/08/2016 at 18:17, xxxxxxxx wrote:

                  Not at all, Andreas, I saw how much you had to do last week ^^
                  Yeah, I saw your post too, Scott; what a happy coincidence for me that you were doing something similar enough for me to piggy-back on and learn from.

                  For future googlers I'll post the code of interest that pertains to this question when I get it all working the way I want (if I don't have more questions, first).  Thanks so much for the guidance, gents, I can't tell you how much I appreciate the help.


                    On 03/08/2016 at 12:49, xxxxxxxx wrote:

                    Ok, still going, and I've learned a lot, but a quick update:

                    1. Now that I understand a little better what I'm doing, I can see the code you posted, Andreas, just samples the shader with a uv coordinate alone--which is kind of what I said, but because the material is camera-projected, I need to query the surface somehow...

                    2. It looks like you're running the inverse of what I'm looking to do, Scott.  Usually when that's the case you can just invert the whole process, but I don't know if I can since you're starting with a ray and converting that point to uv space, where I guess what I need to figure out is how to start with a uv coordinate and convert that to object space (though not even that, 'cause I then have to figure out what the color is, there).

                    Hmmm

                    Edit: Yeah, I'm starting to think I'm treading waters too deep.  Even if I managed to take an object and translate an arbitrary uv coordinate into object/barycentric space, figuring out how to sample the projected material at that point would probably melt my brain, if it's possible to begin with 😕

                    Double Edit: Ooohh, sorry, I just remembered you also mentioned the VideoPostData class; I wonder if that could do it.  Hope springs anew, heheh, I'll check it out.


                      On 03/08/2016 at 14:55, xxxxxxxx wrote:

                      Remo posted some code that samples UVs a while back:
                      https://github.com/PluginCafe/utilities-and-snippets/blob/master/C%2B%2B/Remo/SamplerRemo.h

                      You just add his sampler.h file to your VS plugin project and use #include "sampler.h" to use it in your plugin's .cpp file.

                      This will skip the ray casting and barycentric stuff. But you will need to provide the UV coords.
                      The code is fairly scary looking. And it might not be obvious how to use it.
It wasn't obvious to me. So I made a GeDialog plugin example with it that uses two sliders (X & Y) to pick the UV coordinates to sample the shader's colors.
                      The source code is included so you can see how I implemented his code.
                      http://www36.zippyshare.com/v/AmMfLCrF/file.html

                      -ScottA


                        On 03/08/2016 at 17:10, xxxxxxxx wrote:

                        Thanks Scott, wow, this is gold.  This isn't the only task on my plate, so it may take me a couple days, but I will report on my progress as I go deeper, for posterity.
