    Bitmap Distortion Shader Example Questions

    SDK Help
      On 21/03/2013 at 15:06, xxxxxxxx wrote:

      User Information:
      Cinema 4D Version: 13
      Platform: Mac
      Language(s): C++

      ---------
      Hello all, back with a couple more questions, this time about the Bitmap Distortion Shader.

      I've been experimenting with it and I've found some things that I don't understand.

      Does ShaderData::Output() browse through the points in any specific order?  From the print tests I've done it seems to be essentially random.

      How would I find adjacent points to the current cd->p point?  And what defines a point?  They don't seem to be evenly spaced or anything from what I can tell.

      Thanks for any possible help. 
      Dan


        On 21/03/2013 at 18:21, xxxxxxxx wrote:

        As nobody else is answering here, I'll just drop two things.

        1. ChannelData coordinates are in UVW space. You could theoretically sample neighbouring
        areas by calling Output() with a modified ChannelData::p (UVW) value. Getting the sample for
        neighbouring screen pixels can be quite complicated, I guess (as you would have to do the
        raytracing yourself). See the sketch after this list.

        2. The samples are not evenly spaced because of the projection and the mesh curvature. For a
        planar surface with an orthogonal frontal projection, Output() should be called in evenly
        spaced steps.
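
        Below is a minimal sketch of the idea in point 1, assuming a ShaderData plugin that keeps the
        linked bitmap shader in a member called m_bitmapShader (a hypothetical name, not taken from the
        SDK example) and that this shader has already been prepared with InitRender(). The 0.01 UVW
        offset is an arbitrary illustration value, not a recommendation.

            #include "c4d.h"

            // MyDistortionShader is a hypothetical ShaderData plugin class used only for this sketch.
            class MyDistortionShader : public ShaderData
            {
            public:
                virtual Vector Output(BaseShader* chn, ChannelData* cd);

            private:
                BaseShader* m_bitmapShader; // linked bitmap shader, set up and InitRender()ed elsewhere
            };

            Vector MyDistortionShader::Output(BaseShader* chn, ChannelData* cd)
            {
                if (!m_bitmapShader)
                    return Vector(0.0);

                // Sample the linked shader at the current UVW position (cd->p).
                Vector here = m_bitmapShader->Sample(cd);

                // Copy the channel data and shift its UVW position slightly in U to sample a
                // neighbouring area of the texture. Note that this is a neighbour in UVW space,
                // not necessarily an adjacent screen pixel; that would require the raytracing
                // mentioned above.
                ChannelData ncd = *cd;
                ncd.p.x += 0.01; // small, arbitrary offset in UVW space

                Vector neighbour = m_bitmapShader->Sample(&ncd);

                // For illustration only: return the difference between the two samples.
                return here - neighbour;
            }

        Whether returning the difference is useful depends on what the distortion is supposed to do;
        the point is only that a copy of ChannelData with a shifted p gives you a sample of a nearby
        UVW location.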
