Note: Even just help rephrasing this into a more streamlined algorithmic problem about sets, with the domain-specific details stripped away, would be appreciated, although any generalization that removes the extra structure might make the problem harder. Ultimately, it feels like some kind of dynamic programming or recursive approach will be needed, but I cannot figure out how to set it up appropriately.
The inputs are the following (a sketch of one possible sympy representation follows the list):
- V, a set of variables ranging over the rational numbers
- P = {p_i}, a set of homogeneous polynomials in the variables V with integer coefficients.
- a monomial order that is a well-ordering: a concrete way to order the monomials in a polynomial. In particular this means:
  - for any monomials u, v, w: if u <= v then wu <= wv;
  - for any monomial u, 1 <= u.
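For concreteness, here is one way the inputs might be represented in sympy. The specific variables, generators, and the choice of grevlex order are illustrative assumptions, not part of the problem statement.

```python
from sympy import symbols, LM, total_degree

# Illustrative inputs: three variables, two homogeneous degree-2
# generators with integer coefficients, and the grevlex order.
x, y, z = symbols('x y z')
V = (x, y, z)
P = [x**2 - y*z, x*y - z**2]
order = 'grevlex'

d = max(total_degree(p) for p in P)         # maximum degree of P, here 2
lms = [LM(p, *V, order=order) for p in P]   # leading monomials: [x**2, x*y]
```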
There is an additional guarantee on P, in relation to the monomial order, which can be useful.
Let R be the set of all polynomials (with finitely many terms) in the variables V with rational coefficients. So P is a subset of R.
Let S be the set of polynomials "generated" by P, i.e. the ideal generated by P: S contains all polynomials that can be built from the basis as a sum of q_i p_i, for any q_i in R and p_i in P.
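In the sympy sketch above, an element of S is just such a combination, for example:

```python
from sympy import expand

# Any combination q_1*P[0] + q_2*P[1] with q_i in R lies in S.
s = expand((x + z) * P[0] - 3 * P[1])
```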
The special guarantee of P is as follows:
- Let d be the maximum degree of the polynomials in P.
- For any s in S with degree <= d, the leading monomial of s (with respect to the monomial order) will be equal to, or divisible by, one of the leading monomials of the polynomials in P.
This makes P a particularly "nice" basis, at least up to degree d (it behaves like a Groebner basis truncated at degree d).
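This guarantee is what makes membership in S decidable up to degree d by plain multivariate division: for a homogeneous s of degree <= d, repeatedly cancelling the leading term against P terminates at remainder 0 exactly when s is in S. (Since P is homogeneous, S is a homogeneous ideal, and factors of a homogeneous polynomial are themselves homogeneous, so restricting to homogeneous inputs costs nothing.) A minimal sketch, assuming the representation above:

```python
from sympy import reduced, total_degree

def in_S(f, P, V, order, d):
    """Membership test for the ideal S, valid only for deg(f) <= d.

    By the guarantee on P, dividing a homogeneous f of degree <= d
    by P leaves remainder 0 exactly when f lies in S.
    """
    assert total_degree(f) <= d, "guarantee only covers degree <= d"
    _, r = reduced(f, P, *V, order=order)
    return r == 0
```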
Desired output:
- If possible: find a polynomial f in S of degree <= d, such that it factors as f = gh, while neither factor g nor h is in S. (Without this restriction the problem is trivial: for any q in R and p in P, the product pq is in S and factors.) A sketch of a verifier for such an f follows this list.
- Otherwise output "None"
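Whatever search procedure is used, checking a candidate f is mechanical. Here is a hedged sketch of a verifier built on the hypothetical in_S helper above; it uses sympy's factor_list and tries every way to split the irreducible factors of f into g and h:

```python
from itertools import combinations
from sympy import factor_list, prod, total_degree

def check_witness(f, P, V, order, d):
    """Return (g, h) if f in S, deg(f) <= d, and f = g*h with
    neither g nor h in S; otherwise return None."""
    if total_degree(f) > d or not in_S(f, P, V, order, d):
        return None
    # factor_list gives (constant, [(irreducible, multiplicity), ...]).
    const, pairs = factor_list(f)
    factors = [base for base, exp in pairs for _ in range(exp)]
    # Try every nontrivial split of the irreducible factors into g*h.
    for k in range(1, len(factors)):
        for idx in combinations(range(len(factors)), k):
            g = const * prod(factors[i] for i in idx)
            h = prod(factors[i] for i in range(len(factors)) if i not in idx)
            if not in_S(g, P, V, order, d) and not in_S(h, P, V, order, d):
                return (g, h)
    return None
```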
I'm not quite sure how to approach this problem. One rough idea I had, but couldn't fully flesh out, is the following:
Note that the leading monomial of f must be one of the leading monomials in P or some multiple of them, and as we get closer to degree d this gets more and more restrictive, which cuts down on the possible leading monomials of f. Making a choice to test, we have the requirement f = (leading term of f + unknown) = (term a + unknown c)(term b + unknown d). So the leading term of f must split such that neither "term a" nor "term b" would be in S, given the guarantees. This test will probably eliminate more potential leading terms of f.

If a leading term looks possible, then, based on the possible ways that leading term could be formed from polynomials in P, somehow "work backwards" and build up the terms in "unknown c" and "unknown d", using the fact that they need to multiply with "term a" or "term b" to give the leftover terms of f from how we formed it out of P. I'm not quite sure how to implement that step (some kind of recursion?), but hopefully it would lead to a solution or a contradiction. In the latter case, move on to the next possible leading term, until either a solution is found or every candidate leads to a contradiction.
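The first two filtering steps of this idea can be written down directly. By the guarantee (together with the multiplicativity of the order and the homogeneity of P), the possible leading monomials of degree-<= d elements of S are exactly the multiples of the leading monomials of P. And if f = gh is a witness, replacing g and h by their normal forms modulo P yields another witness, so one may assume neither factor's leading monomial is divisible by any leading monomial of P; that justifies the splitting test. A sketch, reusing the illustrative names from the first code block:

```python
from sympy import total_degree
from sympy.polys.monomials import itermonomials

def divides(a, b):
    # Monomial a divides monomial b iff b/a has no denominator.
    return (b / a).as_numer_denom()[1] == 1

def candidate_lms(lms, V, d):
    """Monomials of degree <= d that can lead an element of S:
    by the guarantee, exactly the multiples of the LMs of P."""
    return [m for m in itermonomials(V, d)
            if any(divides(lm, m) for lm in lms)]

def viable_splits(m, lms):
    """Splits m = a*b where neither side is a multiple of any LM of P,
    i.e. neither side could survive as the leading monomial of a
    factor that is fully reduced modulo P."""
    splits = []
    for a in itermonomials(m.free_symbols, total_degree(m)):
        b = m / a
        if b.as_numer_denom()[1] != 1:
            continue  # a does not divide m
        if (not any(divides(lm, a) for lm in lms)
                and not any(divides(lm, b) for lm in lms)):
            splits.append((a, b))
    return splits

# Example driver, with lms, V, d from the first sketch:
# for m in candidate_lms(lms, V, d):
#     print(m, viable_splits(m, lms))
```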
I have a feeling this problem has already been studied, because of what I'd like to use this algorithm for. Given a system of polynomial equations, one approach to finding solutions is to first calculate a Groebner basis with respect to some monomial ordering. Sometimes this explodes and there are too many basis polynomials. In the case of homogeneous polynomials, we can instead find a "truncated" Groebner basis by only calculating it up to a given degree. Then (this is my understanding; please correct me if I'm accidentally making much stronger assumptions), if we can find a factorable polynomial f = gh such that f is in S but g, h are not in S, we have evidence of two separate "solution components". We can choose to satisfy g or h to simplify our equations; alternatively, since there are likely many other polynomials in S with g as a factor, we can pull out even more detail about some of the "solution components" by guiding the search for more factors.
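As a toy sanity check of this interpretation (my own example, not taken from anywhere): with P = {x^2 - y^2}, which is already a Groebner basis, the polynomial f = x^2 - y^2 itself factors as (x - y)(x + y), neither factor lies in the ideal, and the factors correspond to the two solution components x = y and x = -y.

```python
from sympy import symbols, groebner, factor_list

x, y = symbols('x y')
G = groebner([x**2 - y**2], x, y, order='grevlex')

f = x**2 - y**2
print(factor_list(f))      # (1, [(x - y, 1), (x + y, 1)])
print(G.contains(f))       # True:  f is in S
print(G.contains(x - y))   # False: g is not in S
print(G.contains(x + y))   # False: h is not in S, so f splits the variety
```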