Showing posts with label add-ons. Show all posts

Colinearity tests in Blender meshes using Numpy

I re-implemented the algorithm used in the select_colinear_edges add-on to select all edges that are co-linear with already selected edges, and I thought a little write-up with some details could be useful for some people.

Warning! Long read! (≈ 20 min)

The challenge

If we want to select a path of co-linear edges all we have to do is start from any already selected edge, check if its neighbor is co-linear and if it is, select it and proceed from there. If we are careful not to examine any edges more than once, this algorithm will be quite fast and the time will depend on the number of directly connected edges that prove to be co-linear. And even in a large mesh this is likely to be a small number.
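As an illustration, this path-walking approach can be sketched in plain Python (a hypothetical sketch: the `neighbours()` and `colinear()` callables stand in for the actual Blender mesh queries):

```python
from collections import deque

def grow_colinear_selection(selected, neighbours, colinear):
    """Spread the selection along unbroken paths of co-linear edges.

    selected   -- set of initially selected edge indices
    neighbours -- edge index -> iterable of adjacent edge indices
    colinear   -- (edge_a, edge_b) -> bool
    Both callables are hypothetical stand-ins for real mesh access.
    """
    result = set(selected)
    seen = set(selected)
    queue = deque(selected)
    while queue:
        e = queue.popleft()
        for n in neighbours(e):
            if n in seen:
                continue  # never examine an edge twice
            seen.add(n)
            if colinear(e, n):
                result.add(n)
                queue.append(n)
    return result
```

Because every edge is examined at most once, the running time is bounded by the number of edges reachable through co-linear neighbors.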

But what if we do not require those connected edges to form an unbroken path?

Then for all initially selected edges we would have to test all other edges in the mesh for co-linearity, something that can take a very long time if the mesh contains millions of vertices and everything is implemented in Python using just the mathutils module.

The algorithm

How do you determine if two edges are co-linear?

The first step is to see if they are parallel. This is done by calculating the dot product of the two normalized direction vectors. If this product is very close to 1 or -1 we consider them parallel.


(The dot product of two normalized vectors is the cosine of the angle between them)
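In code the parallel test might look like this (a minimal sketch; the function name and tolerance are illustrative):

```python
import numpy as np

def is_parallel(d1, d2, eps=1e-6):
    """Return True if two direction vectors are parallel or anti-parallel."""
    d1 = np.asarray(d1, dtype=float)
    d2 = np.asarray(d2, dtype=float)
    d1 = d1 / np.linalg.norm(d1)
    d2 = d2 / np.linalg.norm(d2)
    # |cos(angle)| close to 1 means parallel or anti-parallel
    return abs(abs(np.dot(d1, d2)) - 1.0) < eps
```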

Being parallel is a necessary but not a sufficient condition for two edges to be co-linear. We also need to check that they lie on the same line. This is done by first calculating the vector from any one of the two vertices of one edge to any one of the vertices of the other edge.

E3 is parallel to E1, but the light blue between vector is not parallel to E1

If the length of this vector is zero, the chosen vertices coincide and the edges are co-linear. If not, we check this vector against the direction vector of one of the edges: if their dot product is very close to 1 or -1 (i.e. the angle between them is close to 0 or π), the edges are co-linear.

This means that for all edges we need to calculate the normalized direction vector, and for each initially selected edge we need to calculate this between vector with respect to all other edges.
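Putting both tests together for a single pair of edges gives something like the following (again an illustrative sketch, not the add-on code, which operates on whole arrays at once):

```python
import numpy as np

def edges_colinear(a0, a1, b0, b1, eps=1e-6):
    """True if segments (a0, a1) and (b0, b1) lie on the same line."""
    a0, a1, b0, b1 = (np.asarray(p, dtype=float) for p in (a0, a1, b0, b1))
    d = a1 - a0
    d = d / np.linalg.norm(d)
    e = b1 - b0
    e = e / np.linalg.norm(e)
    if abs(abs(np.dot(d, e)) - 1.0) > eps:
        return False            # not even parallel
    between = b0 - a0
    n = np.linalg.norm(between)
    if n < eps:
        return True             # shared vertex
    # parallel AND the between vector lies along the edge direction
    return abs(abs(np.dot(d, between / n)) - 1.0) < eps
```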

A numpy based solution

Numpy can work efficiently on vast arrays of numbers and is bundled with Blender. By using Numpy we can avoid two notoriously slow things: Python loops and calling functions.

Our function looks like this (See the function colinear_edges() in this file):


def colinear_edges(selected: np.ndarray, indices, coords, threshold):
    colinear = np.zeros_like(selected)

    # calculate direction vectors for each edge
    edge_dirs = coords[indices[:, 1]] - coords[indices[:, 0]]
    edge_dirs = edge_dirs / np.linalg.norm(edge_dirs, axis=1)[:, np.newaxis]

    for e in selected.nonzero()[0]:
        # get the direction vector of the selected edge
        dir1 = edge_dirs[e]
        # check all other edges for colinearity
        angles = np.arccos(np.clip(np.dot(dir1, edge_dirs.T), -1.0, 1.0))
        parallel = (angles < threshold) | (np.abs(angles - np.pi) < threshold)
        v1 = coords[indices[e, 0]]
        w1 = coords[indices[:, 0]]
        # vector between start points
        between = w1 - v1
        # if the vector between start points is zero, they share a vertex, so colinear
        between_length = np.linalg.norm(between, axis=1)
        connected = between_length < 1e-6
        angles_between = np.abs(
            np.arccos(
                np.clip(
                    np.dot(dir1, (between / between_length[:, np.newaxis]).T), -1.0, 1.0
                )
            )
        )
        bparallel = (angles_between < threshold) | (
            np.abs(angles_between - np.pi) < threshold
        )
        # colinear if they are parallel and either share a vertex or the angle between the direction vector and the vector between start points is less than the threshold
        colinear |= (connected | bparallel) & parallel

    return colinear

Let's explain a few important steps.

The function is called with 4 arguments: a boolean array that indicates which edges are currently selected; an array with indices (2 for each edge) that indexes the third argument, an array with vertex coordinates; and a threshold value we'll discuss later. All those arrays come from a Blender Mesh object, and we will see how later in this article.

Line 5+6: Here we calculate all direction vectors between the edge indices in one go, and then normalize them in a single statement by dividing each vector by its norm (i.e. length).

Line 8-10: We loop over each selected edge and get its direction vector.

Line 12: Then we calculate the angles with all other direction vectors. This is done by calculating the dot product between the direction vector and all other direction vectors in one go (note that we need to transpose the array of vectors for this to work). We clip the dot products between -1 and 1 to guard against floating point inaccuracies and then use the arccos() function to calculate the angle (remember that the dot product of normalized vectors is the cosine of the angle between them).

Line 13: The angle is then checked against the threshold, and if it is smaller (or very close to π, because we don't care in which direction the vectors point) we deem the edges parallel.

Line 14-17: We then take a vertex v1 from the selected edge and the first vertex w1 of every other edge, and calculate the between vectors.

Line 19+20: We calculate the length of all those between vectors and determine for each of them whether this length is so small that we consider the vertices coincident.

Line 21-27: Then we calculate all angles between the direction vector and the between vectors, in the same way we did before.

Line 28-30: We then determine whether the between vectors are parallel (or anti-parallel, which we treat the same) to the direction vector.

Line 32: Finally we combine the logic and say two edges are co-linear if they are parallel AND (their chosen vertices coincide OR the between vector is parallel to the direction). The result is OR-ed into the colinear array because we do this for each edge that was initially selected and want to return the combined set.

Calling colinear_edges()

If we have a Blender Mesh object we can access obj.data.edges and obj.data.vertices. The select_colinear() function takes references to those properties and uses them to efficiently retrieve all the indices, selection status, and vertex coordinates with the foreach_get() method (Line 7-9). It stores them in arrays we have created first (Line 4-6).
foreach_get() expects flat arrays, so we reshape them into their expected shape where needed (Line 10+11), before we call the colinear_edges() function discussed earlier (Line 13).
The result is a flat array with the new selected status of each edge which we store in the select attribute of the mesh edges with the foreach_set() method (Line 14).
And finally we return the number of selected edges by counting all non-zero values (True is considered non-zero too, so this works fine for arrays of booleans).
def select_colinear(edges, vertices, threshold):
    n_edges = len(edges)
    n_vertices = len(vertices)
    indices = np.empty(2 * n_edges, dtype=int)
    coords = np.empty(3 * n_vertices, dtype=float)
    selected = np.zeros(n_edges, dtype=bool)
    edges.foreach_get("vertices", indices)
    edges.foreach_get("select", selected)
    vertices.foreach_get("co", coords)
    coords = coords.reshape((n_vertices, 3))
    indices = indices.reshape((n_edges, 2))

    colinear = colinear_edges(selected, indices, coords, threshold)
    edges.foreach_set("select", colinear)
    return np.count_nonzero(colinear)

Summary

Using foreach_get() / foreach_set() gives us easy bulk access to mesh properties, which allows us to use Numpy to implement an algorithm that calculates co-linearity without Python loops (except for the loop over all initially selected edges).

In exchange for a modest increase in complexity we gain a lot of performance: Although your mileage may vary of course, I could easily (in < 0.1 second) select all co-linear edges when picking one edge in a default cube that was subdivided 500 times (= around 3.5 million edges). Fast enough for me 😀




New blenderaddons repo aimed at developers

 

I decided to create a new repository for my Blender add-ons. It is called blenderaddons-ng and aims to replace my old repo with a complete, VS Code based solution.

Goals

The primary goal for this new repo is not just to host any add-ons I write, but also to provide an example of a complete development environment based on VS Code.

To facilitate this, I added the following features:

  • A DevContainer to isolate the development environment
  • A complete configuration to enable testing with pytest
  • Options to enable line profiling
  • GitHub actions for CI

A more complete write-up can be found on the GitHub page for the repo, but in short:

DevContainer

Based on Ubuntu and containing all necessary dependencies to develop and test Blender add-ons. It does not contain Blender but provides the bpy module standalone, so we can perform automated tests.


Pytest

We use pytest for automated testing as well as for on-demand testing in VS Code. The coverage and benchmark plugins are provided as well.

Line profiler

For those situations where we would like to take an in-depth look at performance, the line-profiler package is installed as well, and we provide example code so you can see how it can be used in such a way that you don't have to alter code before distributing an add-on.

GitHub Actions

Upon each commit (and merged pull request) on GitHub all automated tests are run, and the results and coverage are updated in badges.


Blender Market, the end?

 


I decided to discontinue selling add-ons and books on Blender Market.

Sales have been declining for a while now, and I had set myself a hard minimum on the average monthly revenue, and that minimum was reached last month.

I have been active on Blender Market for more than 10 years (my Weightlifter add-on was first published in August 2014), so the decision does hurt a bit, but in the end several factors added up to this final decision. It is not just the declining income: costs have been increasing too. Both the fees to Blender Market (which are fair, but increasing nevertheless) and the time invested in maintaining add-ons and providing support are no longer in balance with the income.

I was never in it for the money anyway, it was just a hobby, and by doing this I hope to be able to spend more time on things I enjoy more, like creating completely free add-ons, and using Blender for my personal artwork.

I am not completely gone from Blender Market though; I like modelling too, and I will continue adding some models if I think they are good.

Blender Market Fall Sale

 


To celebrate the Blender Conference there is a sale on at BlenderMarket! The sale runs from Thursday 27 October to Sunday 30 October.

This means serious discounts on participating products and of course my add-ons are on sale too, including Snap!

Check out BlenderMarket to see if that special product on your wish list now has an 'on sale' label.

BlenderMarket Black friday / Cyber Monday 2021


That whole Black Friday / Cyber Monday thing could be considered the pinnacle of consumerism, but why deny yourself a nice discount if you planned on purchasing something from BlenderMarket? They are having a sale from Friday, November 26th 12:00am to Monday, November 29th 11:59pm CST.
This means serious discounts on participating products and of course my add-ons are on sale too!

Check out BlenderMarket to see if that special product on your wish list now has an 'on sale' label.

Updating old Blender add-ons

The last couple of weeks I have been porting almost a hundred add-ons, scripts, and snippets to Blender 2.9x, and you would think that to be a lot of work. This proved not to be the case, because the changes between 2.8x and 2.9x are actually fairly minimal, and even scripts dating all the way back to 2.78 were fairly straightforward to port.

This doesn't mean there were no changes, but the fundamental concepts in the Blender Python API, like data access and the way operators and panels work, stayed the same. What changed were mainly additions (new operators, other new classes like all kinds of nodes), minor changes (optional arguments must now be passed using the keyword), and renamings (groups became collections, lamps became lights).

There were some exceptions though that had more impact: for example, the decision to require annotations for properties and to force a naming convention onto classes that need to be registered (like Panel and Menu derived classes) forces you to check every add-on, because although not mandatory yet, it will be in the future.

Another change that required a bit of work was that the helper function to register all classes in a module has been removed. As we will see in the examples, an alternative is provided.

The biggest change has been in the OpenGL bindings, but although this has a lot of impact, only a minority of add-ons deal with OpenGL drawing.

Anyway, in the list of examples below, I have highlighted the changes, sorted roughly by how many scripts each change affected (I think).

Required version in bl_info

bl_info["blender"] is no longer just a minimum but must list a version that is at least (2, 80, 0). Otherwise your add-on simply won't run, even if it is compatible.

Old:
bl_info = {
    "name": "Some Operator",
    "author": "Me, myself and I",
    "version": (1, 2, 3),
    "blender": (2, 78, 0),
    "location": "View3D > Object > Some Op",
    "description": "An operator doing someting",
    "category": "Special stuff"}
New:
bl_info = {
    "name": "Some Operator",
    "author": "Me, myself and I",
    "version": (1, 2, 4),
    "blender": (2, 92, 0),
    "location": "View3D > Object > Some Op",
    "description": "An operator doing someting",
    "category": "Special stuff"}

register_module() no longer exists

The way classes that need to be registered are dealt with has changed. Instead of registering everything in a module we need to register individual classes.

from bpy.utils import register_module, unregister_module

def register():
    register_module(__name__)

def unregister():
    unregister_module(__name__)

There is now a factory function for this (docs):

from bpy.utils import register_classes_factory  

classes = [Myoperator, VIEW3D_PT_mypanel]
register_classes, unregister_classes = register_classes_factory(classes)

def register():
    register_classes()

def unregister():
    unregister_classes()

properties are no longer assigned values but annotated

I never understood the benefits of this change, but it affects almost every add-on.

Old:

class Myoperator(bpy.types.Operator):
    bl_idname = 'mesh.myoperator'
    bl_label = 'Do something'
    bl_options = {'REGISTER', 'UNDO'}

    someprop = IntProperty(name="Some prop")

New: Note that the only visible difference is that we now use a colon (:)

class Myoperator(bpy.types.Operator):
    bl_idname = 'mesh.myoperator'
    bl_label = 'Do something'
    bl_options = {'REGISTER', 'UNDO'}

    someprop : IntProperty(name="Some prop")

add-on preferences class needs to be registered

This wasn't the case earlier, but it is as simple as adding it to your list of classes:

class MyAddonPrefs(bpy.types.AddonPreferences):
    bl_idname = __name__  # give settings the name of the python module

    somepref : IntProperty(name="Some pref")

    def draw(self, context):
        layout = self.layout
        layout.prop(self, "somepref")
        
classes = [Myoperator, MyAddonPrefs]
register_classes, unregister_classes = register_classes_factory(classes)

Python 3.7

The visible bit of moving to a newer Python version is mainly the use of the @ operator instead of * for matrix multiplication:

Old:

    mat = ob.matrix_world.inverted()
    for vert in ob_verts:
        world_coordinates = mat * vert.co

New:

    mat = ob.matrix_world.inverted()
    for vert in ob_verts:
        world_coordinates = mat @ vert.co

This also applies to multiplying two vectors: * is now element-wise multiplication, while @ gives the dot product (this is in line with Numpy).
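The same distinction exists in Numpy itself, which makes it easy to try outside Blender:

```python
import numpy as np

a = np.array([1.0, 2.0, 3.0])
b = np.array([4.0, 5.0, 6.0])

elementwise = a * b   # element-wise product: [4., 10., 18.]
dot = a @ b           # dot product: 32.0 (same as np.dot(a, b))
```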

Of course you get all the added benefits from Python 3.7 as well although most are not relevant for add-on development per se.

renamed built-in icons

Not only have the icons been redesigned, many new ones are available and quite a few have been removed. The list of changes is too big to include here, but if you want an overview of the changes since 2.78 I have made a page.

Name change for some types of objects

Lamp has become Light and Group has become Collection. Collections offer a lot more functionality too of course but at its simplest level a collection is just a group.

mandatory keyword parameters

Functions that have optional parameters now must use a keyword. Some notable examples:

setting text in an area header,
for example when displaying the status of a modal operator (docs)

context.area.header_text_set(text="something")
context.area.header_text_set(text=None)

And the label() function in a layout (docs), which is used often in panels and operator draw functions.

layout.label(text="something")

layout.split()

Talking about layout, a minor change is that the percentage argument was renamed to factor (docs)

layout.split(factor=0.40)

A scene now has a separate cursor attribute

See here

x,y,z = context.scene.cursor.location

Vertex colors are now RGBA

But the Color() constructor always expects 3 values (docs)

Which means you cannot assign a Color object to a .color attribute in a vertex color layer.
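A workaround is to build a 4-tuple from the RGB values yourself before assigning; a trivial sketch (the helper name is made up for illustration):

```python
def color_to_rgba(rgb, alpha=1.0):
    """Append an alpha channel to an RGB triple.

    A Color object only holds r, g, b; a vertex color entry needs 4 floats,
    so convert explicitly before assigning to a .color attribute.
    """
    return (rgb[0], rgb[1], rgb[2], alpha)
```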

some methods expect a dependency graph

The ray_cast() function for example

so instead of

scene.ray_cast(origin, direction)

we now write

scene.ray_cast(context.window.view_layer.depsgraph, origin, direction)

The mathutils.bvhtree.BVHTree.FromObject function also needs it.

user preferences are now called preferences

Still accessible through the context, but now as context.preferences instead of context.user_preferences

Note that this breaks stored preferences as well since they might contain references to this attribute in the set itself (preferences are stored as executable python)

The active object is accessed differently

scene.objects.active is no longer available, use context.active_object (docs)

To set an object as the active object you need to do something different as well

context.view_layer.objects.active = myobj

Selecting objects using setters/getters

instead of

ob.select = True

we now use

ob.select_set(True)

More here

There is a select_get() as well. This is consistent with selection functions in BMeshes for verts, edges, etc. (docs)

Menus have been shuffled

For example the Object and Mesh menus were part of the Infobar but are now part of the 3d view area.

So, for example INFO_MT_mesh_add menu is now called VIEW3D_MT_mesh_add

The info bar is no longer present by default; the part at the very top (with File, Edit, Render, ...) is now called the TOPBAR.

attributes have changed on the Subsurface modifier

Notably the way to perform uv subdivision has changed; see more here.

mod.use_subsurf_uv = True

is no longer needed.

OpenGL bindings are completely changed

bgl no longer supports direct mode at all (not even as an option), which means all drawing has to be done with shaders. more here

Blender Add-on Cookbook 2.93 Edition

I am pleased to announce that I just released an updated version of my Blender Add-on Cookbook


This new edition has been updated and tested on Blender 2.92 & 2.93 (beta at the time of writing).

It does not contain any drastically new things, but it has been revised to take into account all the small things that have changed in the Blender Python API since version 2.78, most screenshots have been updated to reflect the 2.9x interface, all links have been verified/updated, and the updated code has been placed in its own Github repository.


The Blender Add-on Cookbook - 2.93 Edition is available on Blender Market.

Progress indicator updated

A long time ago I created a progress indicator that blended nicely with the info header. However, as someone pointed out, we nowadays have the view3d header at the top of the screen and the python API has also changed a bit. So, time for an update :-)


Code

In the end, I didn't have to change all that much. The main difference is that we now replace the draw() method of the VIEW3D_HT_tool_header instead of the INFO_HT_header and in the update() function we now make sure we tag all VIEW3D areas for redraw instead of the INFO areas.

Some minor changes were needed too, but this affected mainly the test operators: wm.event_timer_add() now has mandatory keyword for optional arguments and of course the way we register operators has changed.

The updated code is available from GitHub.


Creating add-ons for Blender 2.93 edition

I am pleased to announce that I just released an updated version of Creating add-ons for Blender.


This new edition has been updated and tested on Blender 2.93 (beta at the time of writing) and will work on 2.92 too.

It does not contain any drastically new things compared to the previous edition, but it has been revised to take into account all the small things that have changed in the Blender Python API that would block a beginner.

Creating add-ons for Blender is available on Blendermarket.

The book is provided in .epub, .mobi and .pdf versions. The .epub will be readable on most devices, and the .mobi on newer Kindles, but the .pdf probably looks best.

NewGrowth interactive tree modeling for Blender: new version



NewGrowth, the Blender add-on that lets you interactively paint 3d trees, got a number of nice new features:

Many drawing options can now be changed while drawing

No need to choose them before you start creating a tree.
This includes the branch segment length and the kill distance. This means you can now create bushier and more sparse sections in the same tree.

Tree finalization now sports a Curve option

This will generate a curve based trunk instead of a mesh based one. This results in a superior quality trunk and is faster as well, so this is now the default. The old methods are still there and because this is all parameterized this means you can convert trees created in older versions of NewGrowth to the new curve model and back if you want.

Twig generation

You now have the option to add twigs consisting of three leaves instead of just a single leaf, which will quickly enhance the density of your tree.

Minor tweaks and bug fixes

Too many to list them all here, but you now have the option to add some root and branch flare, and if you draw in rendered view instead of solid view the interactive tree is no longer all black.

The fully illustrated manual has been updated as well to reflect all the changes and the add-on is now tested on Blender 2.83 LTS




IDMapper updated for Blender 2.83 LTS


I just finalized testing the IDMapper add-on and it's now ready for Blender 2.83 LTS.

IDMapper simplifies creation and editing of vertex color layers that can be used as ID-maps in texturing software like Substance Painter or Quixel.

It aims to reduce the time it takes to create an ID-map significantly, especially for complex hard surface models.

It uses powerful heuristics to create an ID-map from scratch and lets you interactively adjust the results. It offers options to use existing information, like uv-seams, but can also intelligently assign the same color to similar mesh parts.

Apart from a small bug fix there are no functional changes in this release, so if you don't encounter problems in your workflow, there is no pressing need to upgrade.

The new version is available on BlenderMarket.

BCon19 BlenderMarket sale

It is almost a tradition by now: BlenderMarket will have a sale during the Blender Conference.
This means serious discounts on participating products and of course my add-ons are on sale too!

Check out BlenderMarket to see if that special product on your wish list now has an orange 'sale' label.

New version of WeightLifter for Blender 2.80



A new version of WeightLifter (20190728) has been released on BlenderMarket.

It has some bug fixes to keep it compatible with the latest release candidate, but more importantly it sports an updated manual and a new feature to combine baked weight maps into one. This feature is useful if, for example, you use the maps as density maps in a particle instance and want to prevent frame-to-frame jittering. It is illustrated in this video


BlenderMarket spring sale

As there were some technical difficulties, BlenderMarket has extended the spring sale until Saturday.

From May 15 until May 19 BlenderMarket will have its annual spring sale.

Many products will be 25% off their regular price, including my add-ons. So if you had a purchase in mind, this might be a fine opportunity!
A full list of everything on sale is available as well.

PlaneFit: Blender add-on to fit a plane through a set of vertices

After all the updates to my BlenderMarket add-ons, it is time to spend some time on other add-ons again.

Fitting a plane

Fitting a plane through a collection of selected vertices might be useful in all sorts of scenarios. Currently you can snap a plane to a face with Blender's snap tools, but if we want to fit a plane to a large set of vertices, we need to resort to basic linear algebra to do this in a fast and robust manner. And because linear algebra is not everybody's favorite subject, I created a small add-on.

After installation the add-on is available from the Add menu if you have a mesh in edit mode. Clicking on it will create a square plane that is fitted through all selected vertices. A size option lets you scale this new plane interactively. The result might look something like this:
Note that currently we do not check whether the minimum of 3 vertices is selected; you will get an error if you try.

Availability

As usual the add-on is available from my GitHub repository. (Right-click on the link and select Save As... to store the .py file somewhere you can find it again.)

Source code

Neither the linear algebra nor the code to add a plane to an existing mesh is completely trivial, so let me highlight the interesting bits:

Fitting a plane

The fitting code is quite short and straightforward:
import numpy as np

def planeFit(points):
    ctr = points.mean(axis=0)
    x = points - ctr
    M = np.cov(x.T)
    eigenvalues,eigenvectors = np.linalg.eig(M)
    normal = eigenvectors[:,eigenvalues.argmin()]
    return ctr,normal
Any book on linear algebra can give you a better explanation, but with a fair bit of hand-waving it can be explained as follows: points is an array of 3d vectors. ctr will be the midpoint of the plane, which is the mean of each of the x, y and z-components.

In line 5 - 7 we calculate the eigenvectors of the point cloud. It is a 3d cloud, so we get 3 eigenvectors and 3 corresponding eigenvalues. Each eigenvalue/eigenvector pair can be interpreted as a direction vector and a measure of how well it explains the spread of the points. This means that if the points lie roughly in a plane, the eigenvectors with the two largest eigenvalues lie in the best fit plane, while the one with the smallest eigenvalue is the normal (all eigenvectors are mutually perpendicular). And indeed this smallest one is the one we pick in line 8.
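As a quick sanity check (not part of the add-on), we can feed planeFit() a synthetic point cloud lying in the z = 0 plane and verify that the recovered normal is the z axis:

```python
import numpy as np

def planeFit(points):
    # same function as shown above
    ctr = points.mean(axis=0)
    x = points - ctr
    M = np.cov(x.T)
    eigenvalues, eigenvectors = np.linalg.eig(M)
    normal = eigenvectors[:, eigenvalues.argmin()]
    return ctr, normal

# a synthetic point cloud scattered in the z = 0 plane
rng = np.random.default_rng(0)
pts = np.zeros((100, 3))
pts[:, :2] = rng.normal(size=(100, 2))

ctr, normal = planeFit(pts)
# the fitted normal should be (0, 0, ±1)
```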

Adding a plane to an existing mesh

Now adding a plane consisting of four vertices in a square would be quite simple, yes? Ehh, no: the Mesh object has a from_pydata() function, but it only works correctly when adding to an initially empty mesh. So lines 15 - 27 essentially replicate what that function does: create 4 vertices and 4 loops, compose a polygon out of them, and add it to the mesh as well. We could have worked with a BMesh representation, but then we would not have an efficient way to retrieve all vertex coordinates, something we really need when working with thousands of vertices.

The code in lines 7 - 12 is a very efficient way to get the coordinates of selected vertices into a Numpy array: we create Numpy arrays to hold all the vertex coordinates and the selected status, then get all of them with the fast built-in function foreach_get(). verts[selected] then leaves us with an array of just the coordinates of selected vertices, which we pass to our planeFit() function we saw earlier.

We then create two vectors perpendicular to our normal and use them to construct four vertex coordinates.

def execute(self, context):
    bpy.ops.object.editmode_toggle()
    me = context.active_object.data
    count = len(me.vertices)
    if count > 0:  # degenerate mesh, but better safe than sorry
        shape = (count, 3)
        verts = np.empty(count*3, dtype=np.float32)
        selected = np.empty(count, dtype=bool)
        me.vertices.foreach_get('co', verts)
        me.vertices.foreach_get('select', selected)
        verts.shape = shape
        ctr, normal = planeFit(verts[selected])
        dx, dy = orthopoints(normal)  # definition of orthopoints() not shown
        # can't use mesh.from_pydata here because that won't let us ADD to a mesh
        me.vertices.add(4)
        me.vertices[count  ].co = ctr+dx*self.size
        me.vertices[count+1].co = ctr+dy*self.size
        me.vertices[count+2].co = ctr-dx*self.size
        me.vertices[count+3].co = ctr-dy*self.size
        lcount = len(me.loops)
        me.loops.add(4)
        pcount = len(me.polygons)
        me.polygons.add(1)
        me.polygons[pcount].loop_total = 4
        me.polygons[pcount].loop_start = lcount
        me.polygons[pcount].vertices = [count,count+1,count+2,count+3]
        me.update(calc_edges=True)

    bpy.ops.object.editmode_toggle()
    return {'FINISHED'}

Blender Market Black Friday - Cyber Monday sale

Good news for cost conscious Blenderheads: this Thanksgiving weekend Blender Market will host the yearly Black Friday - Cyber Monday sale!




I will participate with my add-ons, including the popular IDMapper add-on. So if you want to save significantly on WeightLifter, SpaceTree, or IDMapper (video), head over to my shop on Blender Market this weekend. Of course many other creators will be participating as well so you might want to shop around a bit more :-)

[For Europeans: remember Blender Market runs on Chicago time, so don't start shopping too early next Friday :-) ]

ColorRampPicker, a new Blender add-on

I got this idea from Substance Designer, which offers an option to sample a whole palette of colors from a reference image for its gradient node. Now you can do the same for Blender's color ramp node. With a color ramp node selected, just go to the Node menu and select Color Ramp Picker. An eye dropper will appear and you can sample from anywhere within your Blender window by clicking and dragging. Unfortunately it is not possible to sample outside the Blender window, so you should have your reference image loaded in Blender's uv-image editor.

I have also made a small video that illustrates the process:

Availability

You can download the add-on from GitHub. (Right-click on the Raw button, save the Python file anywhere you like, then in Blender select File -> User Preferences, Add-ons, Install from file... Don't forget to enable the add-on after installing it.)

Check this update article for the latest version.

If you would like to write add-ons yourself, you might want to take a look at my books on BlenderMarket.

Alternative

After publication I learned about a similar and much more versatile tool to generate (and sample) gradients. Check this BlenderArtists thread to learn more.

Add-on: Selecting similar vertices

If you select the Similar sub menu when in vertex edit mode, Blender already offers a few options to extend your current selection of vertices.

And although useful, I frequently find myself in situations where the available options are not sufficient. Especially in hard edge modeling of objects with identical sub parts, I often want to select vertices that not only share the same number of faces, but where these shared faces have something else in common. For example, in the object shown below we have two different collections of bumps, but there is no way to select just all the spiky tips in one go.

The simple add-on I present here fills this gap: it checks not only whether vertices have the same number of surrounding faces, but also whether the average angle between the face normals and the vertex normal is similar. This allows you to distinguish between flat and spiky. The amount of similarity can be tweaked to allow for round-off errors or non-smooth meshes.
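The similarity measure itself can be sketched with a few lines of Numpy (an illustrative sketch; in the add-on the normals come from the mesh data):

```python
import numpy as np

def mean_normal_angle(vertex_normal, face_normals):
    """Average angle (radians) between a vertex normal and its face normals.

    A flat vertex gives an average near 0; a spiky tip gives a larger value.
    """
    v = np.asarray(vertex_normal, dtype=float)
    v = v / np.linalg.norm(v)
    f = np.asarray(face_normals, dtype=float)
    f = f / np.linalg.norm(f, axis=1)[:, np.newaxis]
    # clip guards against floating point values just outside [-1, 1]
    cos = np.clip(f @ v, -1.0, 1.0)
    return float(np.arccos(cos).mean())
```

Two vertices are then deemed similar when they have the same number of surrounding faces and their average angles differ by less than a chosen tolerance.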

Availability and usage

After downloading, installing, and enabling the add-on from GitHub, the new selection option is available in vertex edit mode from Select -> Select Similar -> Neighborhood

Black Friday - Cyber Monday at Blender Market

Good news for cost conscious Blenderheads: from November 25th - November 28th Blender Market will host the yearly Black Friday - Cyber Monday sale!




I will participate with all my products, including my new IDMapper add-on. So if you want to save 25% on WeightLifter, SpaceTree, IDMapper (video) or one of my books, head over to my shop on Blender Market this weekend. Of course many other creators will be participating as well so you might want to shop around a bit more :-)

[For Europeans: remember Blender Market runs on Chicago time, so don't start shopping too early next Friday :-) ]

New Book: Creating add-ons for Blender

Yeah, it's there! I am happy to announce my new, short-and-sweet, information packed book on creating add-ons for Blender.

It might be small, but it is beautifully formed :-) You might want to sample it (or buy it, of course) from the location below:

A sample of the book is also available as pdf