public class HiveGenericUdtf extends org.apache.spark.sql.catalyst.expressions.Generator implements HiveInspectors, scala.Product, scala.Serializable
Converts a Hive Generic User Defined Table Generating Function (UDTF) to a Generator. Note that the semantics of Generators do not allow Generators to maintain state between input rows. Thus, UDTFs that rely on partitioning-dependent operations, such as calls to close() before producing output, will not operate the same way as in Hive. In practice, however, this should not affect compatibility for most well-behaved UDTFs (e.g. explode or GenericUDTFParseUrlTuple).

Operators that require maintaining state between input rows should instead be implemented as user-defined aggregations, which have clean semantics even under partitioned execution.
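The snippet below is a minimal sketch of invoking a Hive UDTF through a Spark 1.x HiveContext, the usual path by which the analyzer wraps the Hive function in this class. The SparkContext `sc`, the table `logs`, and its string column `url` are illustrative assumptions, not part of this page.

```scala
// A minimal sketch, assuming a Spark 1.x build with Hive support, an existing
// SparkContext named `sc`, and a registered table `logs` with a string column `url`.
import org.apache.spark.sql.hive.HiveContext

val hiveContext = new HiveContext(sc)

// Hive's parse_url_tuple UDTF is wrapped by the analyzer when the query is resolved.
// Because Generators may not keep state between input rows, each input row is fed to
// the UDTF independently and yields zero or more output rows.
hiveContext.sql(
  """SELECT t.host, t.path
    |FROM logs
    |LATERAL VIEW parse_url_tuple(url, 'HOST', 'PATH') t AS host, path""".stripMargin
).collect().foreach(println)
```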
Nested classes inherited from interface HiveInspectors: HiveInspectors.typeInfoConversions

| Constructor and Description |
|---|
| HiveGenericUdtf(HiveFunctionWrapper funcWrapper, scala.collection.Seq<String> aliasNames, scala.collection.Seq<org.apache.spark.sql.catalyst.expressions.Expression> children) |
| Modifier and Type | Method and Description |
|---|---|
| scala.collection.Seq<String> | aliasNames() |
| scala.collection.Seq<org.apache.spark.sql.catalyst.expressions.Expression> | children() |
| scala.collection.TraversableOnce<org.apache.spark.sql.catalyst.expressions.Row> | eval(org.apache.spark.sql.catalyst.expressions.Row input) |
| HiveFunctionWrapper | funcWrapper() |
| String | toString() |
Methods inherited from class org.apache.spark.sql.catalyst.expressions.Generator: dataType, makeCopy, nullable, output

Methods inherited from class org.apache.spark.sql.catalyst.expressions.Expression: c2, childrenResolved, eval$default$1, f1, f2, foldable, i1, i2, n1, n2, references, resolved

Methods inherited from class org.apache.spark.sql.catalyst.trees.TreeNode: apply, argString, asCode, collect, fastEquals, flatMap, foreach, generateTreeString, getNodeNumbered, map, mapChildren, nodeName, numberedTreeString, otherCopyArgs, simpleString, stringArgs, transform, transformChildrenDown, transformChildrenUp, transformDown, transformUp, treeString, withNewChildren

Methods inherited from interface HiveInspectors: inspectorToDataType, javaClassToDataType, toInspector, toInspector, unwrap, wrap, wrap, wrapperFor

public HiveGenericUdtf(HiveFunctionWrapper funcWrapper, scala.collection.Seq<String> aliasNames, scala.collection.Seq<org.apache.spark.sql.catalyst.expressions.Expression> children)
public HiveFunctionWrapper funcWrapper()
public scala.collection.Seq<String> aliasNames()
public scala.collection.Seq<org.apache.spark.sql.catalyst.expressions.Expression> children()
children in class org.apache.spark.sql.catalyst.trees.TreeNode<org.apache.spark.sql.catalyst.expressions.Expression>

public scala.collection.TraversableOnce<org.apache.spark.sql.catalyst.expressions.Row> eval(org.apache.spark.sql.catalyst.expressions.Row input)

eval in class org.apache.spark.sql.catalyst.expressions.Generator

public String toString()
toString in class org.apache.spark.sql.catalyst.trees.TreeNode<org.apache.spark.sql.catalyst.expressions.Expression>
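For completeness, the following is a conceptual sketch of building the expression directly through the constructor listed above. HiveGenericUdtf is an internal expression that the analyzer normally constructs during resolution, so code like this would have to live inside the org.apache.spark.sql.hive package; the import paths reflect Spark 1.2-era internals and are assumptions.

```scala
// Conceptual sketch only: this expression is normally created by the analyzer when a
// Hive UDTF call is resolved. Because the class is package-private, the sketch sits in
// org.apache.spark.sql.hive; the package locations below are Spark 1.2-era assumptions.
package org.apache.spark.sql.hive

import org.apache.spark.sql.catalyst.expressions.{AttributeReference, Literal}
import org.apache.spark.sql.catalyst.types.StringType

object HiveGenericUdtfSketch {
  def build(): HiveGenericUdtf = {
    // Wrap Hive's built-in parse_url_tuple UDTF by its class name.
    val wrapper = new HiveFunctionWrapper(
      "org.apache.hadoop.hive.ql.udf.generic.GenericUDTFParseUrlTuple")

    // Two output aliases and three child expressions: the URL column plus the
    // requested URL parts. eval() is then called once per input Row and returns a
    // TraversableOnce of output Rows, carrying no state across calls.
    HiveGenericUdtf(
      wrapper,
      aliasNames = Seq("host", "path"),
      children = Seq(
        AttributeReference("url", StringType, nullable = true)(),
        Literal("HOST"),
        Literal("PATH")))
  }
}
```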