class SGFXAPI::VertexElement

This class describes the type of a single element within each entry of a VertexBuffer.

VertexBuffers consist of a series of entries, each of which may have several elements. For example, a vertex buffer might contain one entry per vertex, each holding (normal, color, texture coordinates). A VertexDeclaration describes the data layout of these entries, and a VertexElement describes each element in the VertexDeclaration.
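
To make the layout concrete, here is a purely illustrative sketch (not part of the API) of what the CPU-side memory of one such entry could look like; each field would be described by one VertexElement in the entry's VertexDeclaration:

    // Hypothetical CPU-side layout of a single vertex-buffer entry holding
    // (normal, color, texture coordinates). Each field corresponds to one
    // VertexElement in the entry's VertexDeclaration.
    struct ExampleVertexEntry {
        float         normal[3];    // e.g. FLOAT, dimension 3
        unsigned char color[4];     // e.g. UNSIGNED_BYTE, dimension 4
        float         texcoord[2];  // e.g. FLOAT, dimension 2
    };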

Public Functions

VertexElement(VertexDataSemantic semantic, VertexDataType src_type, int dimension, const std::string &name, bool normalized, GPUVertexDataType dst_type)

Constructs a VertexElement.

Parameters
  • semantic -

    the “point” of the element. I believe this is mostly legacy from before named elements were used; I’m not exactly sure what OpenGL does with this information, and I sometimes abuse it, passing in data under a semantic it was not meant for.

  • src_type -

    the base type of the source array; that is, the array that will be copied from the CPU, if any.

  • dimension -

    each element is a vector of dimension 1 to 4; for example, a “color” might have src_type or dst_type UNSIGNED_SHORT and dimension 3. A dimension of 1 means a scalar. The dimension must be at least 1 and at most 4.

  • name -

    the name of the vertex element; it should be descriptive, like “color”, “normal”, or “extra-data1” etc., and no two names should conflict across all the vertex data (possibly spanning multiple VertexBuffers and VertexDeclarations in the same mesh) when rendering.

  • normalized -

    if the destination data is a floating-point type and the source data is an integer type, setting this to true will automatically convert the data to the range [0,1] for unsigned types, or [-1,1] for signed types. Specifically, each value is mapped from the full range of the source type to the normalized floating-point range.

  • dst_type -

    the type the data is converted to and stored as on the GPU; defaults to the same type as on the CPU. OpenGL itself will sometimes unexpectedly convert the data to something else first, even if your shader expects the same type as the CPU, and lose information as a result, unless you are careful. This library defaults to keeping the two types the same (no conversion) unless you change this argument; see the example following this parameter list, and http://www.informit.com/articles/article.aspx?p=2033340&seqNum=3
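
As a hedged example (the enumerator names VertexDataSemantic::COLOR, VertexDataType::UNSIGNED_BYTE, and GPUVertexDataType::FLOAT below are assumptions, not taken from the library headers; consult the actual enum definitions), constructing an element for a normalized RGBA color might look like:

    // NOTE: the enumerator spellings here are guesses; check the
    // VertexDataSemantic, VertexDataType, and GPUVertexDataType
    // definitions for the real names.
    VertexElement color_element(
        VertexDataSemantic::COLOR,      // semantic
        VertexDataType::UNSIGNED_BYTE,  // src_type: one byte per channel on the CPU
        4,                              // dimension: RGBA
        "color",                        // name: unique across the mesh's vertex data
        true,                           // normalized: map [0,255] to [0.0,1.0]
        GPUVertexDataType::FLOAT);      // dst_type: floats on the GPU

With normalized set to true and a floating-point dst_type, a source byte of 255 arrives in the shader as 1.0.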

int SizeBytes() const

Calculates the size, in bytes, of this element, taking the type and dimension into account.

Return
The size in bytes.
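
For example, a FLOAT element of dimension 3 occupies 4 * 3 = 12 bytes. A minimal sketch of the likely computation (assuming a helper that yields the per-component byte size of the element's data type; the real implementation may differ):

    // Sketch only: size is the per-component byte size times the dimension.
    int SizeBytesSketch(int bytes_per_component, int dimension)
    {
        return bytes_per_component * dimension;  // e.g. 4 * 3 = 12 for FLOAT x3
    }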

int SizeDimension() const

The dimension.

Return
The dimension.

VertexDataSemantic Semantic() const

The semantic of this VertexElement.

Return
The VertexDataSemantic.

VertexDataType SrcType() const

The type of the source (CPU-side) data.

Return
The VertexDataType.

GPUVertexDataType DstType() const

The type of the GPU-side data.

Return
The GPUVertexDataType.

bool Normalized() const

If the source type is an integer-like type and the target type is a float-like type, the conversion will optionally “normalize” the data. Here “normalize” means mapping each value to a float in the range [0,1] (for an unsigned source) or [-1,1] (for a signed source), where SOURCE_TYPE_MIN maps to the lower end of the range and SOURCE_TYPE_MAX to the upper end.

See the documentation for glVertexAttribPointer, in the section on the normalized parameter.

Return
true if the conversion should normalize the data, false otherwise.
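
For instance, with an unsigned 8-bit source, 0 maps to 0.0f and 255 maps to 1.0f. A minimal sketch of the unsigned case (the actual conversion is performed by OpenGL, not by this library):

    // Unsigned 8-bit normalization: map [0,255] onto [0.0f, 1.0f].
    // Signed sources are mapped analogously onto [-1.0f, 1.0f].
    float NormalizeUnsignedByte(unsigned char v)
    {
        return v / 255.0f;
    }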

const std::string &Name() const

The name of the VertexElement.

Return
The name.

std::string ToString() const

Returns a string of the format “(semantic: (semantic), name: (name), type: (type), count: (num))”.